[Binary content not recoverable as text: POSIX tar archive containing
var/home/core/zuul-output/            (directory, mode 0755, owner core)
var/home/core/zuul-output/logs/       (directory, mode 0755, owner core)
var/home/core/zuul-output/logs/kubelet.log.gz  (gzip-compressed kubelet.log, mode 0644, owner core)
The gzip payload of kubelet.log.gz is compressed binary data and cannot be rendered here.]
KFc^/u H+h#;6B̀j@o]T? W c2oG 1Kġ7E&܄>TϻzWaLJ. L tfS.% yc >A.:MZl@Pys3@hq(Hs4eof BGzv-:o4(\ $(beqyzޚ׃J/%i ,D4P(46jeb $zbڰ Wmk?Z6!h<֙pGlukw^LKQDFI1c(QBucvH')AP/{aFY 9 !\;Iywi[2ny]y l<L`Z:pԥ: l;# JF%H-G 9Po,z78/ZXP脸@jIPM"eLB*#@&ՊN%1Z|Oy ྤ" Qkt3;76t l*X:(@JTU UC@Vɒ3L)ycU0 vߝn͒_GUD|ҷ9J>b=^EDB.`Ajx( 7D. .B @5Bi5|1 hcJEI9h @;`'Wx %@A@!Mv%K |0bF F,nZZ"΃xV x"“2%Y9vEQEL$Eo^KT`Cbhm7ayMat>;OӊaM{v>kDycQ n\L3JG+Q40z:60n}OSQdf1jS֘k>Hr9-Eon0&Gx|~4L@ŒТ EzwCR^` 6%T2pyZmqn@^4)r3:P"P`@H H*.mP􊀔kzL Wc B3M7؊DRe 4IUU;K ZeDy apaO1 ep#5JEP-\GF$ a^'p `NLUYnj$hHh*U|RUgR&ol0 >Y rStr޳VKe|+Mx7L1JH{kbmQyc &X)*:LZ8bAP+Bb)Q*`  ^S{(0qK^J Fʭ@\0 u~ԋN딘FJRD$bp(h/#I!:Η.P,tSJqc2+еwW4"􎊷!TQnԋBv֗kqz}': )[7ցpNu_/'Tjq򀡯CF7T:԰[y67/V??WW_U!I  v(\o=Gb+UW)@껅] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@] ԕ@\%))`9iSY@R3*@=u%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWu%PWy@^H2OI OH sQZ#>z%+ J_Q~7U(kAտ~WlG'Ggrz.}00(]ʣibHA[.?߂wk<5(<^6PUU*z?8q(E˟O;'LyK%Hysqd12%=ubl]mrsx?>;jʯ[G~zAxviV [l+2ep2`;]*RԴ FzFb I&jTEc\JE]26Ez/Nz%u.ǣYI|31 E4JtݶY~f]~0*WmE).?FRd`^B6E?p}"zO]uW2@Qe J(% r!iA@G(-\FWrudllpp h(y%+XdNL)(cL0B:fUl*G)GU1lccxhx0AQҮ9SoJS(OIYKt;"7cԫ΂:7^넵N@M9KfZ{x;/Mb._At6z]F8lwo81{ &mwnh-q#`W0 F\+q#`W0 F\+q#`W0 F\+q#`W0 F\+q#`W0 F\+q#`W0^-FX*f`%r|)qZɏ#.Qt1@Ɲ2Wl;{«hJH` ̳TE*mC BD@Fg,6;7#vGLg>zΐҚ &pj2dM* "#n L |`*k0Hr0bYSrhSc_n(ǜ=! zFinVVWjz޵y6SX&8P m!0t87_wg.:z~ {`.5Y_ʆ;MxRރt]چ2Hp!uĨhYVpBb'Tn"eΉVz|<b71==6E1(˜`sZ0& D(c`L1Q0& D(c`L1Q0& D(c`L1Q0& D(c`L1Q0& D(c`L1Q0& D(c`L|;C Ka_)t%s{%~Pۅ.~Ц<^Liqqv( O(7*GVUX"uo(g[?[jnInU!ʛ]}B/z|]pA2̅G KwVp\LuI`Qq`+{d(k"m1~K究1w4>;y-z4Wppߞn.0n@׊:R>Fx 19(`H*X]tq v#qvC= uyVn 0.߁DA\;cA  /0J54X4֊?onjWvsM#ɍP bcRQN,cFKKc*Z]=L.PJ8#iQj@Ђ@tS5ޜ0,j(~ ,7Ʒa=lη`~,ԬqWh^a:OgKxHtץ|ѬxaO0o0RzD X4H+j8"apǐLtQ9Hy3`$?"a%:9K,4ŽnN)0_%P0#  քɥe.'WiX"g 4X~``>NOfpiZ!Ҷh_kЛ7ӋՏ*'aՉ۩ 9?sS?8#L>.ը479Z# _'?^Mg[Ur\sa 8wkKrmWDz\t5mQC[,(cD#] Cza&YP[+V1zt3_f9b809Ymu>ɾQ1Qw<ҹ? 
O\ q$qO`!T/EBRa՗qcrftm%]DH|T h<ͺh=?U;nú{bCSŶ-(9ق0n!k=~wܤY-u!|H1B}Y%ܷ[UVBYo$"oH"r )5) YJ" u wD1NasNLhʋކtZzLDxEܖAJ3*lPN`dza &x0Y`5+} DkLw=*R1=͓gj !b1E!mT 98}DF>)CȬV]ΌV]6r[u ' {^o*qvR5%yY~_-=6x`~Kǫo gq{:GC 7"w ;o߂gٔW#O ; eToe0l@<£<ZN ' єy^llgp6iM8BsLX(GQV9(cMwԢ׷ٜ=9mW[]Ξm)]tYK0,"edEHG['p PSRPd9J5e.7k)vƱB]d^eႢ4U\[wkdM㦣ƟGf%@X*$(w ICU[}i " <{hd 3 j-$M*KD{CJGt%*&h'U+\lv\XvHQ^=HlBb}XTDYbzU04`P# a+q&u2"MYaBg ,d 1Kah k|@N@8mJiLfy5dI0MkEZ?+2|Oz2ñhGҎP t Xȥ'jp4U- ƖT%->D%T@eXS@"F+C{<Ű ]ݒtoo6\4UrnA~t9ȷWn-Ⴇ#z!Uxd!R&R/5eDDL ` Xy$RDկ^4H|54|WunZ~ix *<_?%m^ZJ8>@cdYVQ ,Rg2DOL?{m RꌄKUy8J^brp(+)!) $qْ=tƇnf Jq&P82Gb% zVBo+Ȫw.#=-?NWv-#javr4`C[gXO{^5+fSϺz+%+-~XcYn),[[2/[In2|lTl4ȣs,TZZ%IE_4l+ Q@PԴq8 <7Rس 2խD/:o:M_2{uO?՞QBLq}md_ Y%o+5,{"a2)%HVU{p_#C2Sr"yx+~^w/$kwcrr%ezz o$e| h|#I .(Nf'8 78('=Q#(c, /D<b\?T:RDUK]EդKh&yNtyNNtrT>Ye(j{GG%C]&NFJL 8fj#Vj,MJ(EЊRBrhaq!qq,tJ`UղTMhJ[blQIla[hz[x$c9f ivqlӧoJYʎF_GoqbK!3DŤ'<(T,BLޘh(.ZG(p4ls:?1βRy NBP#$&"J>&.]8-%3\RcO-HYܯ=m٤vʡ;8Q;)k=:Z)3>1iF$c(WIHeӂzȈ )yThkB$^ȴFօ$TGema{XLv')aO_j~kiaH{["\>M mxz)ʃ py2Jxޣ@q(x<ڷlu״^:igaPb*l^"Dϊ'/fTȷǬN' 3V "lHEL34Aɮb*ֹL4Rqyrv<)V rDI5'0qGYbD"JxUFmse(D:υfB)jcADxpV[8GBd(\]P`Hb[UeoH&ϒ }Kk]a۪/Vwq}jKA嵼}wq]BĖ.}.h IČ)&iBDnaO>m9$\*?zfӊAF{;y>OeLȔ<^%)XHF+F4ZT(T$XO|-4WYw-@Ƌ %XbQ$E+(QjS^s )h?O2Occ-M#B  P#)pk R8j$:9p1)^*^ہ WFX7-"#"A6ނG @.xhI60~wbhrY%G5\Yo~ѭKGz5z1]Ww!;|_vM~&[A\GCT}0Y4Xq$skOCaPNRsN=]L\đkZhgU.G0׋ 3 qy}Ri1!ZG.ħHݢ~mܯB><|h>ւe8%.N{4`hyFgc> ^.Q%\sc i4-7c$oxk|Qik nںaZ,҆8`żߎGO>9ox80+#{?d[mJ)ϽNCHX[?L6 ٸtp;ۂX{Sl(j? 
CTߖO<|i2*ys%xx?ےYFhafvz8̏zR;nT3_Gg~xwJ'4_~?_>)ӟ?Ͽr=.Um"hfuؽk ]s ۻ|fwwv #~3ZoBmB47b ;Ҏ zv8x]qa]M0+cEI(^":Ʀk6T7sAQ5֭~{P&聣_#q \A;y%A>xgO %p &N "TW)`XGJi:q|n37m)] ۄݔcfLgD tqd6t4_[zJKu:׾k9_,M9E@aPԴq8 vjrAV&\YI3BKKdTi1u$`ɃgOw)Eyge[aSmIS/^R S=DT{ETڜW:^x[` I(A@)yճòi]c5b"%Ke"JxIo'AG%aP[2RgN/ bMsX*J,(ǸָDpYH 8jܡTe-J<+ń.oCޢȜn:-pSc7.5űwN̡6YH\[0̗@FI✻ BM3~;Jr"-u >W8OkBǖi^Qhrfyii)•2őVOgR?J0X4p!8 c( /u6F%CE 58A :gDB0:QYíN!0Ve1r],g뛺xI`o*?hWm˵{tؓbd3fمw=Z}l& UrL]ue3W SxMڴuj]0iʰ<[gA޹]kz|׸D⠶絖p=w׾OR?Ϧ󞎃ZN=gNZu}5M;iCcd)oOG6!h6!f#k^txy vD87u%($P3 #f-;r˖Ѳ}kݫ,c,RFswK%%!Rx1N)5=I&XA')olPV@I 1I=N#z6F 7а_LMwCL>ϔL=4\} l_EίnnSu pf;SVjiDS짉:nNextZ1g G0 F/BtG>&eD'l(j^'W~%qvgUYތ-|lP/OM!.tO tA&6 %f!9S14ha$ZB0ʞ*gRe6ra!-X\.gZ3&*ybK"_Y@s4Of޾;YII)V%Ѳ3b2J`13ݧ?n&܎rzQ>YYp?nog9}UB{.CYAz'染,nk_6c`QsZJ >g; LC>B*QN[)Cm@;9 jj1 ~O?HKHE'rҙ*1ɜ;!,KBxQ-..ܫ{KpӗyMВz[#'B`VFÄ;(Ը]uە$wd,9cSԧb!TcJFhɽK\zO"XCɟD>t'Qk$F]=J$=.uşFqqnxf?L|ӷPMa^ 1 B !JȠAZg1y\2@U`\U댣4pArX M#D]P@)`Bf; ϯn'a%juro,*`_o ua+ iOviS|^bQ*Nh8kTLq[_rm!{(*xcu+ZR:tXh,H[h D yqblcpW0b `ѽA眝U|X'_̐Ng٨+R@b*!;D()Q) Jrc@4]#znLJ  {_~ﲧe }u fQހ~zzJ=VM1[DomeD rl唐bNʻͮ?/ˁ^t|8#`-ޏ ѫr~gO 6 >vDZAZ9kQ\aa XPqe%" *db:r%FHF1g:a-$eK!&"5Q\up-#R$0k2Ŕ~8gp.@.& HJQ>14cZi++FN5tuOC[T9p\_rLY"-NlB11)5~B`!d5er\ԚҋTJ&{Mo^G)ʟ=ܿZ.` <EXY j%h 5^FkHaA X4p0Ɣc1\wjb NP*D,N@mrpkShǫp1rxY`ϖ75-fMgxum gތvnY,eagٛmfjY°7y7Ns9e+sa>et{u=ܺZsMsbg[Vزjux]7ʀZvC.ӲmwݘWXh*d2V*2nJ/IUK;D fqk=FZ<jQ\/?7Y}~'(b=VEfzZ#)䈪V<7ıe  =Zai]FKfz~ܻkH_-WZfMr3Q "qC:&h0CMК3ę8"YZJMGvOO^ aGcnK,ZآݺjcP[3Fsv^q5v(H(`N07,QBk;(xƴ^%-4l~DmK,M]&crǤ6\ࣦ{^\fB+,T_pF{)cbȅLB}qo]86l#mz̦Ijڐ-4(m@mg& kR+6i"xaoѵ]O VP/tՆkb@I 1I=N#z10Z+FN/,UW󋩿z;t}癒Ǒ:Ik-OzXΤǿ#?Q~uu{%+;-܌sgtJ4 AtJRbAEJ7Re,rK`Rkjc x,Xp˨Bdh4\І`He9? 8PwA@ȽB"DD^lJ{a(EoWA_ơjU6Vl9$Q>^pBRst>M` T[gdϖ1̏2vL2FQ46jn dP{)ES+SɀԎ$ł19f ӎr:Xv|NJUڛv3t߆4_;P4. :1W"r]/{5yd~~9=;=k\Ӝːam3ԍ1 Ezɭ. 
:[fY.~2T7suᒺ5|ny.q=U7\f/zrL 3@ 5r?_~qrwH.7)w kw_ZXޯ97|\ o?m^&Tb !wϑzqu-5奵IWP珉R`ߞ5r|2#oQ jd'4 [F&7|XQh^kmֲEȗNc@?d-0hZ\`O[,4>GOے,˴- utZ\KƲ9kWdrGQ(σ rGUZurV0$.Y13jarIڙB3'M^Y]գf9W\?NKm ظI&M.1׏=u38JEc #@34 f/3xxH0=\eY,Z]0Y:}>cG*j2g}Uv~4wi,)"qwtpC ɺ>i;lPQyǐ9fњiى*{].k g_Fϴ0(5#8'|,$Mϝ `yAˌ52piD&}1gis8p21!spVd⠤,Y"}#8Y./e/\Hߞt~L_ ^98!ᘚ#?}kxrKpɾʴ[t\9ΙG3~/O7GLHxx|۹HaĒG(LpƁ vtj4pt{#ԳߨYrFU_`,y"sih?Suû8orv?\L㨅> ΅kA}oy2Ϡ4Q/gGaoTM#moʷUL}u⌖6ǔFI?y۹RӝDn:FgZ$f7)`fƜ랞m6:AhKVR,I%*F,F`yC6'jҊe{ ==TNZT֗:RWՕs˙lΑRe(0LA8G3.}EKaDkv~~|oRMs۶wp%!U'Nҡ6\w˻N_7N^$@^Cwo?{Ûw\Wݛ^: (ZV&zǀE͋FYh*oZ4h f' AFլrq¬f8 ޖ GV+ W wD= lnat3wx+Zc1z"Z$̲RDBK:d up{a.M;}Ҏv\/eo8C,Ləa1UdYR7#hoC*:;QmLlxP_C'b9:n+=dW-&ܲd˰Sx/_!)e%!q H"͙0dU<\[m4*ju5W t\[ΓV$ӌ9J`¬ 昙CZ h[jiS&=_Y-nEl5WTʛ &y~lNǾF3│ǻP 4qs Fq˙Ym.䰚@Nji&2y'kpW#FhX2R(8 4* k-dB AD8V%F.rM6fuY˲"b{CцAMs;1BĐv ieV|LՖk5CjJW]/? r~"x>OD}JC>>ՐsۮDׄ~wknؠ~%18dhY-&'&W3}yfg? |_t=u^;|px;k^6َ`&%.R? =rRyob&sȑg A5@gm-a\wt^]LWؚLRJoKF $b7QK:k|rqM} YR3E{&hZF#l4=6 iRf`Z܎~Џ 宠vѱ%j+V';4z xB/1Y& k! ̬aV(@E' $ }Ŏ7\Zw*K14Zfe̛M\SjNңGe_n.w¾V̚@ͅj&dL%v7}νCZ;ǥmqSԻ婁lS`h7Y$H`F& N5+ʘ6LjM;5IʾD#E//&e-jM> noTUL{\C»hd\5 n4G%wzL{vX]ɧޛL{ g.g-rdҲ?;WN:ems[ ^,uZI:|}P/=,źf:=\Uv9!iVHhEX@M K1g&j{w7vЍ/.:-&68~q+/2+:tPܹ«P(W[vt. 
9/0[J fC?w{}Q5݌NlJtqk ЀN'7i2&iC.lR^KXC|ː djV"zgkdzl:kE)=Zhiڳ}._{ gBw*P|,foŋE^XŮˋ+P^\ ɭ#"*"k]b%\=C*B'\\,7pe2XJ\!&U1W}+?4pW oŖLr3VLjt1t®4Y_L ^1ɔ~=;UGm%pu7k d;g'WWv=R/D>\ \s*p83+Иyן&jnh_(:_?'yN/1|F_tB1$;dk :;%d)%&"1̓}on[Rs?<\akJJA)~L'c%2 \ꐽdBܘs:ĺw9LHegHA ڨFA:Z幉dG_;qw-snFqCPҜ5N`CU&{3 ϛ֝0l&QU4*~:O$uxH0.c;#nfGp6{`F`FP٤,ٳt ;\O=%!ET^ C`mUZwJJ<fYR>,-Brϯx:AYbYd(+'Zbju:WMܢ }9Z?_5>ts3GݚuXs9eo [F]yѪPGB>zT36`Zs=aR#l~mrMǙޑcwL@|e#f$^9OTǬꏁ*yr&fw>Jgsv~&1ŽiЫA7 \"Du4(@.ZbL9נlV`,1 KC(·(zzӕZh]h0zM}-gMJUcݫjsM r3NʍtP"L'ܰ(ieMP,%Q( DZ2j!7u䶀{z[6e;pm&ZbZ fRo=|f8;J/_.g H ``Bփ+ /,WA&͍ݽZl;u+Dԃf(n&py=v:."crnM-/,d5EWt4Y)s3>qoٺql3tgo \4|fkː*mUg&m1JdXq2nŃytmDNb{hxwcEMFw *;?=*-D6la٧.c(6]"+XE1SӫDϦx^0OLԫ"q#/P;cǧs= _oh6[H B_)VUL>M.I(tnl-AlqJ;:oto>SQeoE?jŞ)Qoul()Y6ފ "mt8o.KP>IDB0,c% k#xIIrFF/gzA%9)&7#g3LŔdkWq tzm_t"C="uoA8!{_T:G3;Z|63 bELUiJ.(\|Ae$EۢϖlypYCn<7΄on槧ԅEm9Qa$8@fLNA8FQ[K*`PtF)E.jxh@|KS"mRZ YC  ,cARXr)eBUH,Y߁׺YJ"+p?H|d ‚ 4a/Y(D^LfU&T'`լɼRX񰒝,}=hosQ]Yf]ب!Ԣ1)c (UQa#J񯮗hMlF.$3Bڅ |8Yy"z^ˍo: kK I35yQH颼5%ZK{*+zm)grBP9 WJP @f BT p`U 808ˇya1º kԘ$7L21Ax.Gv|9F)C{׹DUlT7`CMCP1U7`a2D+xsQ6]EKp Ǥqwބy#ց-o`ԡx6ΐeLWoAFt5loꢊVFi{{77O(xɃoMz]*,{*>kIu{{GaM~{ Vm5 ~p0#h^ -- Y}_z[ %2)ZSR~Nq_OYϖъg#!r Yx1'+r%ۈÓ/伒/Z^m;/NLZ^_?-in/>$QjjC.350`5G}/WfTyƜͫ5NjfwCڭ9&e<:˧?>'~/x#IS0)~~hVCxˡ5桭-gX-u{qeL\b+=o iOz{ݴ&V>ﹷ} Ih:Ԉ󘢶*yoE""4l e騼6HzioRt3c<KqE.3b1#akH GD6׌Xd4 :Muv*/n3xd^->zSv2W^]iW KֻK{?klϔePX쀍.j:+rMkQWIwFhfɽO/w%`<2`"`љ$'g=To~6rlCOLsy[q5[> gbYf)|I 1&XV҅lFtfJJ%F&&hɲj@KGBNX WS6X4 +֚9[Zk.l U(s;GD=,xWSZ+ 06|uV d^%!`4 KcQH&֨ 2T:FUU:-$|r@_FT\9l}ڊ;L5p+ZwC>|6"]8kJYҷ~|+ !BG_)zsSyZIFi&5B|_:dQj@ԐEhscfc ntmyQcғC:(98 իL_{휔Q@HlQZ01 Q2%"L" Ҧ|p| $>#;o'wXZrf z >1W?X"^<ȭgoZ6my0cUQ:O]3x{G*.6gL6G-,72U[TWֲ5KΨJJmWWLWWoP]9RWL0I3ꪒ;RrUr1o^]uE]-8*5SVnSz|un^x>axH .ɭGϓ3}A=)@)u 6Jϻ}9>O*A /smeЖ`=`-៖oc/T|Igޙe|;r_LЧT5I-$w-z] ښ|_{ϧt{&6STȿRM!߾"uhiȧ:+&k/ıkJlFge٥Q3]%wI6Ib׉c*H(!W vѼKfW 9v JbrnSliF˯DQi+Qi,p֝zIƒ!u%W$rA]QWL- RWWoR])I(u`l<oIh/'%;Ω9+_jMa4^-(-* n"Ji&ɾӒˏgZD X{>鞾zsNFuE3ΣUV&-d`P 2aV2hB[?'LINe4TJ$&g tp!x]IY!Y ,ɒHuFΊBl8Hg`fӈRP?P%VL.CR3u(Lׅ̂(]QJ 
NY"pGl$ᔳB*'_0I]T-W-c4u}/c$clf3gp+L%BmŨdL"R6H-2E2vcͽ}ղ/O'F<ntj.M mD 8ʜO̜w Z2>:L6)Ry-9հ`(R45׾t3z싕gsxn]t ɞ"LETQĨcx1 «L9R@*hlfFP@.Zɒe=%1ԱǤ3rVD30~"d  K^/Nqmgi!}#t}g15r|2gtyA8 ͐Yhyaٲ ڰ+R-BFv[2\d2IiYw0z穄,CqQV)F{E:#q g}r|12#p~mgM[ߦTXnW-W,Ɏ"d|/&cg aQ5,:fgGA1¦X4o3/D i/,*bmC>m6wdi٩xA&(iQ"":HUd#Ƙw mIfG)x.a}zLˍ7K^|ik@XE. \ m@X5I$%n:qPoR&EeMV֢.S-IPSٺ}e,h`vs_$>"Ao/dQ.IOH/xIFEsRRҹܳ7T"^DR(I{BNJFBJ[sA2C6R[AZ H(֞,"/u,5H :K3_1r:؂WQns7g&@:6[M()\[F,FcVgO3"{`*X9,3uhNnq;y?Uʃ'8>4MU|;ih_Zp|v~Q#CE [oqt 𠞀0E%'|O0jr~ ?[W?&W~j/|w9>~~!ni;'9dl5]킓7~3߂KZҠxKg˚QK:,XhS [#XGAj6У6ONtí:e}RuEg2RFf /a^U@={_J׊qj$sVMk]vi]*R/ />-UC(@W so.&{/$?~?|OTzO:q2mL?6!&i޴fMk7m_]]vyInEVvYjUSFhGvMsپO-Cohyw7j_5U\ ڨDDQڤbkٔ)٨"{#饽 3Ҵ6yf>r_sSG` 9Cvm0L"9"0Os%Y)0kaɦ(/7dߥ5Kh'n8nō; :y8pu|n}D_9R/wj;tYpYQ@!JdFl ^x;)zdwiP-ʠ#6K{Qlk m8"_QFc*#QҔp&((#(r$6f2k^gRIYceY"iF!{6|fyI|Z1:͒E/_#JfVG#'ĭ,+](R 3a~q~XaRh;C2vK5];=?m,Qܨm' ؏G9g3y~[㦀ܲp!E@ ţ)Mհ20e"%CegnP.=[2FNs$?_X)FKuVB)[r90i_Di-V}nJ[%>~'wG!\CO| 4\K,)w}:9yi ?'}{7f_f6yVN;f3_? 4G@eEg wn'=Lv>8q)@ 7H%c~y+8?z"L5|qg0VF]XC 0%LA $40Tk'EW]/-<(s`*4 U#T[mm-g&Z;ZUzUO>]wuS#f{z mvX;(ٔ/LHiQ>dg=fGdUSD)1)9@~gGUMJhI5y9 #rV6Q{ ~Xy[6پK]Q.4\Ԗe}UG'<ϧW-RL=qF IW &@F|j֦Ih'ԩd7Ét7x]cqvMvczcہ@33oxoxn0oڸZu(FXm=NL~4Yދ&WuUL5* x"E0q=[#5_mk}n5= :A*h'S W^9,[_m@@ (g\E-q xc7} [-2b`rU9|IM`+"bcBwUGv(Yh 8-~*&zH*84`ɁQHj@ٔgXj.Q'OV~U!(޻|{ˉ<_@rr޽~qho]^ C˵a<$i=IMͷtzfsrq6ErEw]/;|puyջw׼˜ߩ\vۦWM7wηbݵyr!gAJ,"(6^w/YYj:DE),lNll9; $>j.R` 6sj./.|1]\5 3%+]r2K%ۏ}~iJA|\O&X`<`VS*->X0V/H5岢 zK@0W6а_Åi͔u촥t۠iWj5^6 yc^ܣt/CH{Epcaߎg1Z$TGD2e<*XS&I)CACec\zl PP}.`Gϓ\@CASQ!bԱ z\u@d `WbQ1cOk bbP( H+jƉىw4t<ۅ? r>{)c5̖6&&VΰJHAGqΛa~>7G=?ƽMld)8-(8o..tK\Cxy;Kmw- PCq{(n|9_+ɇ֤sqT!q? `5̦ʫ- 4N׈L}!)>$ŇCR2{o0?]/8(8-;<]tSw{QUNY7Xȥ2U00$>B#? e(Z EK|T;4Ԩ9P#ygtK-`j(_m1+@䃾־j(TV^0bqCsT/Bvv/em]^ǹ3o|6,WZ۾_uVoVvP⠽uN$},Fm 29ץC38Z :ـURmK^GfmL#ωB)R6@T|еUtΎw~ RR4yEr"mvSH"Xbq۪"`S  ]PWbUK\f%K)k1k9DƷUyI%. 
y k0\bŢ}Lqv,;&bN^8rdo6+֒k IUgkI 6vD6vŹ%ѻ]ɸɿuhe7$F:0@ )@ʀ=QlV h?v6J C*@:mB1ۼiG/Ko֋kӣT@oTqT*P={FQ?sEI=835$ NV}ʏLLDUOHE%vxCK2ןEv,gҜϯG7.b䕠 gT(e&694d.'vfXB:H0G&֩R6T9 T7Qf{ogv ~g;nu뺜{o־FIM%͟)nj+F"@0f?8f+:?;d*L)8rPJ\a 珜wtʰ9^Ϯxv-yVE7Yޏt8dUU&P1+ؤM5Z\sAm;bK4YJJj01n~<=>M.KHʀJd[MpX  m0\scbD֕lc=ښ}!B%E_5G1nK,30 !x7(TNߛ?/Z=C~j~6"v1"?}'>gE>YZd $Q7(Or Q r_FO$h)eVOԧ369M??ͯ~aZMNqU51*z泀9~zLn)|y^|C}ON>ƛޕA.Q5(tzz2:YZ fr/Jau iP8unqF1P{+}jJf#Zc]l&q~Ӡ~1nv鐨7  >Г6΁VY꼓mR&n[K8_#aU_Oboz6NnAEM~Dur0؟̶| %'6'0+=rf՟|SE-?I{{W%?g]g{uo~:O~?P?ه߽}o^VMm >k?~پi Ms u3[D.ffh_B4{/b& iTB΄7qsVRՏV{o\Ib<dA)KI"IKF|4AySa.U|Ӊ<INٜSA͢/YRP!H_)(RȥQE!+6'vB|Q_" ٮ\׾N/&.}QJ͈FðbMjRcG,j7u16Mjc3'BG#e'5MhBds* x´22C E{bMKc~ĺD lpٌR?x4:}+"ˆH;Dq݌TƘ !Xy> *Q9arrrY(!*ƢR"n|( X8XdTGMVQΊ"vY\tYl슋0.;\\kǧ >2eHp KC 9 Dh^&+P"CaI̗"tx\<<,6;vC^nxx[{d/=r8G 3tf:g~4BI~0ǏзC⾟,gHD" A*` i;W%Y(ךYQ]ǽ\A- פlf@҉r~$G2gϞ~ur)\@DB-!O$сe0Q"NVظ+ԊrZ}\m`dPi!5X.oCyכ^o85wnm~n~>[\x{s϶?]1hl><S 9܂-&' #KڭTCPe)/G㛽\\_G<> 7q?q)OZy`iR/S\dp,[GY]`?oM6;o ~PS ""?*| m-l|g;/n564hwۯޯoPqY͓juE͟f-~gz=Nf{ ~7 q=KVK{=,{7MgcA~!jAכ߿=^p v2ĹXk叚J <S (RliYE OL3|ȜS_ :^2XK/A-QFl`PRj"QFLqMD$4BhS= y4GZa.A~?Ķ~iAyXfؖ-vN3}&͔SՂM^v57Yo~GE7aBLvAk3ȃT~į\N}k8/HFSgqz):5J+9v:K)dS:a4yApt/U;\e) [+.9 }oU촷kȨxa#TS{Fޟ}L6m@ҊQRF2ro{ 4=} JҖ"i1@ *̓ vI۝i~1[#U,sR>ƗZTzP9=WVpɝ6$,uJs 8Ab|uLq_&=En34솪]x Vu*UZ?)yGKWN6T(~wGKVXrܶ}^ͷ7W0-j^*~>G5͟06s~(+˳|Cn],zMԜw^t4-AZ->%__NhWRkn]Uɻ*n.j,{Q|i{wԓh*Dy^ ,gq9:M]cQ.= q|X;C(FJq&<$sJrHJ&$>$xoq)@E(0ǃ@$SB] .ļ ڂ<X`qVq%W XM r3;`Aq @D :C g ׄ´!:OQ`LkKcL<g A~T/ObCyB;|.Q>lx/yBJ@C(MN$3!Oc!kC4<՜ $p[$EL"p/qɌ;T;)<[?g\A䙑ůz984X|<2ʘ!I0G32,T[Vz[*hY!MB=nsk,ėJx^MU8ay.޹%|4H !Ib[bNA!+x)sIzLW&= ntC:SP]w &Oܚd8lS:qtact>OR{4,Rc.k:.y8.Q?tlj+FzZ>OtGI>XѱdkW D)d:9o+5NDhdR8 V VFZUPㄺPN#v ?Нd))T\1}M$jIxĤS2FT$LZ{-v4RSx˹q,<"JH 9KstIwS_Wsr?gˏAl22 ?Ywczte0g-:*X&1zL %tO'w?vD|Z"/WVDTu@D墶 @8BI~ttbx{`2fw8S;yY_9ob>Ĕ \tnx[7}(EXyn8A ?m]z7n%x=g ?͆ɽbrB WPXM%c/Q.T))رy+-hs)p` C1`(S,y/Lr({ *{xNxYwv4НNr?9k=='iy]][ۛR,^ZNtZ .aMH:+)'s- HVl&KHR:*zjn/Wޕ *ʡ,]hv ~r/nv9M퓌I'OP uUXJEMɩ7.bU5}=ŨRAH)U}E~Svoڪn-ΠL@Yc5wtВA*חUdZ$^FB):w4E~h?8<>>YKpD 
. o+)y`AR4"ICǨĢkaí,2;.VǑr.wam>U{Ax ͘;hزڲ/(ZfhP&4%Ji !xBX"X|RP)eՁ$ZZugߊ9>?=MF7+_/0 6EoYn+,K/(j:u ҊRI@y?l^wY&鼃7p~eg$^wDb%@Opp~=$:6r`MF|h}ѐBD 3CTvCA|ϵwPr㐶|YCbf=GT3Q}ܾ 3YjN h}N.&|p yI_k gW-kI)u܁GE+$L\ ;^Gܽf?Qc"=b5*:cYRVϭR|TrZcA_m[B"[c DA꘹%IUD踣%Pb@D뢲a1Ue=+϶SƦQ2Y~,q*6LfAm$hbB#X4m3 C%FiY &2k[L)mlLEs§̍[PG K?4AoV:o ,ܓXx*Yڽ[&_tY[ߩ/5[- M> FlQ,"ZK楟k1&XoSZe\ HFTSh+977,{8*yrrK PXI֌ͺsfX.l2E U{T6xC ,-Ϸ>7ta298?})'Vx (9e`R \.[pA{cH(,ÄmOi 4Tc/d%tJ6ټMfWض$Q%QшV٭4_31kqOmD[ߠf>XJKza<MRa*J{Y$z!U*@F'W*m hck7Bfd(ɱ$U| YG !bHlT O6n}U1Mj}W#QqЈVl:uc}Ao(G{9IG~ao9a?FO}n8L,!DfѨbA^و8塋Cx+ Et҉ &g?v{Nw)~uUWW_Їǭ d|h5m":JUD.lJ)fȎ,c!Nm^0nԓӣ o_EjjC0 Eq" I6LbQdA8o idxcSI'ɽR3bb6',). W/%({лJI YYNޝHZy\0Mi3$kVK@z6cJ()L9 ft'9+`魠JB$@5ĸ`nGKqz׻nG v_Jv=Lyx>^t9: ?V-kW'g_Fմ YIҝM>록/x8'k]:6Ƙ-^]g/^_f_t.Μ?2/V^s_/}a׼֓;μ% jcK`nodS3jc3e@L^? YtrU|acJlou5Mn\:ZHmy贌8rtPzSR="oѯ8|Sn+]v¸nCuO(>29:=?Y$gE߫|+7Jb(ώYHJOO矿ǟ޼ʿ{#2Tow_MwoCU-G;L!oh[պt{f| OX>&n_IqIx 7&3 ա2:Uр3IUJ jftT`$=a]^|>a<_ ,zE|DrDfsDDac~ޗ0!$3پj&yzI{Hht5u)j8,痔4`шko<[_>olpM2v^X.j:+kNV&$nuOEa랚=TE)O൰B2>S?[ xH,!F$sDgM G!Jh3JKKl欂R. tM@r-;[2x*N(7ky&FKfqwTJ[mPCXGHiKW {_whnc#4 #~-<Z|fY?7cv4rzz6ѿ?_7/D kӼVD=ԣ9xN'ri-IK˼ΛP{{e A%HB 좉ej淊zk3W'm\X?k(5RvZX,8*_Y:#f2] Eg(5;]F kIn ?jf{l9ҨFi:7L}l`4i^g?amzg`]mGf6;-!΃MzVW849M?*zq :^Bô_Nu;`?_ Gzy<4[G0LplR =#C;Rh;t PA0>^~wN4d A{/eG$d5ځRDVxFz֚M. 
==^bqqw6z#Xu+ @*wsO?O TN?jwtݿ"ϭ)wn'v~ys jXlqB:С ?8|i |ܻjJ(j:$PRK&IdOؔAR󾜒6:*M/Ҷz0K_Q%c,MTYlrc,eKkNV}iN=S]g*.o)km#G/m>d1@pn3Lv_.(ښؒG,bGܲbm9&E٬bUWu#Ac0>xpڇU`4> 6404т4T;Đz%W@jJ@&ˇ 4L6W}\nbsFk޲IKSCI _TBZ{5ۂ;݀fߏf msy ç1 A ϼ'<`|}bFcAP*i J0N3@;A2BAd*8aդ]kgH:c )*x *e^M&$7Vw/ mb&IF17rn?6` \лtnpm#yS19A e\1Q+T dPX.R G@) dWτu4&PL*ZbwR^g( Ib6 ߳g7rLbHlݔvEXKorz<9]۬|Ƴϫ;~Xqo8,?޽SFDr?WX ($!$J.A"X `#:5p}\XY<чPۈ"ono'pٞȞ>xeeϛ%+*S1iI[6ycEQ))^Kp,x_b闥xK V5|MܜM׮G꟦Ub<|`e -ī{F>HW d6de&i^[;&h+$uc\'oK^be_W(oV^5k_e:RC˻˙[me|y㹀hNN>K2e B %ETv1)$H~*I#kAOGOֹ򦤯\'^߀yd|O[RمE` 3z*!('H!AXr1"j20bOEA3ӳ{Jl7%}q%)魐P'mҎ<'Q/yY#J5!i⹌T>zͫ,x,7 B$pAsS@5I?d'eL%YwߝTrA WANNavJV "ůeU  K@#(nUX/P~ЬDhPobH0bLKHR:*r#HYB&zr(&K72nQygjF3?v$7n(A`]R2j~;Sr*Xmu:Ec;c3J1~ռMoRvK۪^ ΠD@ַ^%a-A=_&+GOElEGM/*M93$|R3M͌>YK+ Ȼ9A.h+)NT|AhDCe䚢d#vvۅmB4CW܄eҘm0ML~{Blh kc2j X"QJ%ak@D[RZ ! y_"X0),sJdu`*I"vQ fg.]FאxSbmEnb,Ȳz _d'X Q9&z0ĢUF:Y<8d箃 ┲WW{6Mr{Y}MklT$Ԣ1+c Q$.lXfkؑ̒2^㜬ZxѠMRУN5rEyk ^ CKTI/検%%C PBLE$I dC/`F41̺U%,7>lb(Ev,|=k*E'$6lꪈ7`EMCP1W7`f2D+xsQ6]ԥ_FKs&7RhX럩HghH$B;X>tgECPuOB?NG7MS8>—L(ߚwT/7UVW@n^0&?hϪ&AE#\{!=x6> /nt͐y$!PSBI!P{'bYN{a>xt7?exGGYWUI/ A1𼟹L.zsqz' i%_Jn=Lyxz?$BѾWOZ)GD4ʃԲ^jq?=qڌ~?}␷Pw+cZ-W#l]u.J޴ߝ.}FL=nMq4bnKNd: k^~=+' '07tՍ6 byLi2m`źO'G竉_9>nxUͽkXTy=Ԏ /Ng6}6Ѵؗ{r4]ب?NkA6OgrëB0P]ē<ʫx~>l_Q{W?COţ4>9d& ?yoK_o|Rj&`}뿽i[unj稗E9~zߪYMǹ!ՃZtTa%M|YC7xpN99jPQ. 
ْgtT^HzloJ4-KD>DjrQQ$LH\}a*&戈KvWJpˡNmM1>5>G;_GŠ!2WD%c~i} x@ՌwRr}tܵ/~EՋk7+k]^\^e-H- ZIe7qUNwve:bͱ Kiw1ojѢ˒_ƓU*ӳd/-gQadžNk78:gklMlq\ ^_=y?׍{O?~=/$cbl4QX)7y`u%NHѻtt{X5mH$kB+F!a(%5/\D@ـj!Su?I[Y&Φһ<̫k|}֝cՇ?#GqCI蝑6%(Ex}a@$wUڻy-G;_xIٿiarza6跃;1yf߬3ON7f MaVMv&@Q bCT(&J"X5V4:"r`MFT`X )m76/A2@czfFc/^ڲQIvk,ɪ%U%;I #)RL ơ:.վ5C 16оAзKTYGD, >(Fal'FQ7T9x)$G?< y&A{_O]عDtzXMh>?FFITNQM9*;5Rzv5"FEϏWm){@E;le<1L1[khmcWM=բ>hvtjཋF/wD{|u$͕Q6MڪN-hStهl`b8̜6B!6ヂƒ)!&+1؊y9)Ľ- WE;zhv8Ɩ P!m= tWd.I0YDv9e@fJsϴs>9Kj%qX?y`<aˋy׷>y9X?xɭ+Egf/;Ԧ1ᱬ'b+ż;G_rPԂL1 EFrVOXXDԃQ " "nԊEԾcR59k] }Oʰjv- y* EDB)}cr`Zж0L0cu'm 4aUgٴ\9<`\4Z8aA%][Pι[rA)!):󽫂Caxh}n~zvIsNe~% (Aۃ$#F{K'Iݤ/ - ztpxC.9O͐:`*&9bWO,6OOOa;"J`kop0 O*ԍ]%;A_,z_GfuQO3ٽ9w鐫 z}i֢ޯ$ɧYoeM%e^YQX8g97/;gpeOy&߈gLi˦w]mm|e$L0dž%}~[?/>[}E-3Wg_po?ZLiWڣưwmozƣS#qD-">-">- *`#p*iii~TϨ`˦?090?lfCrĽj qPnʄis? VK}2ʓ)E! 4@;K%r52I~Hyj3}$ Oἶw-Ju'e6-sk982mTNŽ1x { t9Y)#Syb3GmꅆSB]VѫYۑn>Wޗl72خ`k>ޣQ埧/ld+L뙘NTmTR^WVZ#jaWzƎ`?,iǟ<@rd iB0QiY5ھ*?-{u6MԔ [oɆfu92M7>Ɩq 3BOw }_`ÖlS^sc}V3rQ_r6jk Y2ژ3Yk f)$qº 5.X3fw#C'Cی&R4ڳ; \n 4^6|bwaܤ0uPə\u%:)%LQVC`A`|#rDv>bwܺ)~NWg'"oÃqg|g@-!w[:NjOˡ0+X6ܳIU`S$76uelMםW3 ޢ#݈Kcɭvrv%D[6/9Z֥zPZil*AlVp-aФ G5Ydvi\B8'k_"χ9hqo<ϸ89;qkHժ7襥d| ̑')h!׉j."7z,xCѧ7O};${>S* U/8랣3|g`π~|2^T+i n6"K9ƷO hi>?꾧T%>T.-QOlO)Ue|j#v滜adSmJ)٤ة.[6v?dbՈ]¬I]E#IXWSb3m0f.as\>$^?^~j"ӛ}`bMKߛXz֐ֻKL|EtUgS- Tz(jU^ ڙ|O) BZ:50]1ou׽Oxyw(lẓjϠ4E9uFs(; 5ŬʼnCkcj&Ъ_} 1Cԥj*#Ebd\0XFvM.W3'0M![rRNEYWouzRņͥwJ[ٶbs xM.k/]k+zW%.^,-Di`/Dɇ ,Fi۬dV B1ԕǢSыhlh(& l|He Rjjrt[aYq5;1y8~^砗wх7ф1v@cJ{;T懯zK't6:+6,3^ h܃Q*xZ {ǗsQfbOдbv85Cp#GyG{JѨ@D1djCoB+ŖȧPmJ"eVc貚wI_i.Q208%E!@/{ ~ȒWg{~%=MԘWL6ͪ_WUWU%jb[SSS~*S3t {kb~hǿ._/=.`.V mBq:9HIq(CErY#TcFxI.Rg:EKbLx- UĶ?9%I=+|Qd|l;wibyf,Ʋ4Ԭ-M 1SE%x.Cix`@ą_~|X6E)wЂ2$B J{qS "FГ4`q*H-eJxwhNoIˤN}Y/NǛi8ljZU~̦;pMq]rU,b`?40d=)'G} ,jAq|}?><׫i_)<[ghI9wSRm g7gzI <$=cQM9e9vg[3 JP\Xdp ^kfE4 8; )/gեb -@ L1 YܫEϽpm;"E"RHxb 'Any`NE+=V)כ[4yc0r~[0jx0*hTo4T\!˓L. 
\ej:\e*5p EɆ"s)Gx Mltc'y"ϖ}9nYϋLqsB ~{[ N?3u?3z|ɕTԽL<jq*pjJWʆCW)y 'L \ejt2FpJhA:!B)'W\v2p73׮#\IC8e'WH0pz2pT*S;G**#\)NIB+_W\iNZFT*SW(c 6Hu2p=W;RKjTr;+Õd@E-`Tr lcsG-a0c>\>N$8Jޱ-}L@aky2pL-]L!\1y8";=Ծp4WǧMfYUԵ>]}pmefKY|k^~X‹ZQ9t~=IKL煱ȸQKЂ WUs|3r`Ga++g /?b:iʩ)l(Ys-޹* 6RqCFHVD4~eHR xm:) UDnz`o=|WPg7g?/1d]B17gمܙ|hЖKSQ%c WOqMu p<ո )z}>s60}wZthst@]xq9VF- }֬Y.Kt>i"*uE_ LpT6ǔvN ZQ$= q9: E_Hu(u­9J>&ctUzj\qUWbTꬓ/'>+U:^sf&f0j 2"HsJLǜ{.`j*d/k67GOA#@H5:C g eIahw)J(|:iĥG UD 2# nQB}3),CsT9* 4ZcT&/wjbwOVaMݢ~AM?mS͛C,rqZb A(1VXjYJD0:p︓8) #>CFkCZ>k Ts"$䓽5DD"[&k(j2ByZA[?|?jCS1|c8yJpcSAUh#sDZ8K aERߞwD}%U 2ц␕]DmٍnalI_jij׍w7r?W `a-GWqkb]`ԮڽB7*kB"\EgU$ 1M_:OYuݭ}vr}iUl24] <+Q^p]5;8I'i6r,D,+s0*N{pοx ) kY&1l F8v: E*udZ9bZ'S-##kW5:[A2@6|5ƝįfXӊa.S"{ۦivnv&@ctyqxq;L][>i>xevCN_ޖx&nSLNji21%:/LBm5| E=hwl6-wyˍt[ _$?n<-xeV{3l޺ǬVK6eE/'vˋVlªy]_CDac̭,4 /~.k4 cIN&+U\+t1^ܞ_0}{Yuq1;\,K/9_(9?r%eh1)r8N͊r/~e7N';NNSU36,k]0܀taNn\j+]{Ѹ\oշPQ)L6L& 0ҪGU)2µLaA)|ԌPQ9+T04 M$vIxQoh1vLjG#u o<Ո':E D6i#1HZVq[#g dJUky{Mr \ЯXOWZaug LSbz}U)U4T3}:\ΗTj{58Ph;іyZC t6>P+T}!hZ0"X " -ZDi_){jJ֣c^,#k߲$22 J \YXy Qh/)EfFIhղDZ5  4q0?4MD.e lʻuhkD(ReUm׶\Ujrh=bp۵8sRFﶵ0)N1ZAu,$FqQ| 64#TI@`D)"M<#?(,\IFZ?uL,xF5ҕ9;mR6|OꩀÏel0Uy*wq05 >_`WX8 >28}nȴEO|Kv|(OMqؾkQɼȒ~_ ж =bkZoiis˖40-(W\DQ@ԱDC}21F^zUjDxEK~8kkNXc)D"KJDJmd@H~8.e 빱1$I@e$gSG'?LIExj;5rv{%G3bm,k;>/DFLl}q_U̷Q1hߙwu3?W0ЂzHeˊBW/ߩz{ZTyמ+I=|" ކ䃥Ni'600@ꥊ(Ebh:MB (w[ %QMnpcٲz9{A.Uz~v} 2M\} IzUǎ>\/;W]/>ONz Jڲ8cYڝuW!ǧm/ﲪ[-֌ٯ|Td66zkFݤUύa>mouww4f{Yw`l( …vj; ٧hNW!M˭6g4lgMdMdrdMdj_5ϚxYA]*iu ׋%O?ըAE(TBK Lh%8%q8&{G+8mvB*c Y)nNy(bHXJ\1O:ƃRT){' xNzcpe%)Ĩ5ɞ>5 Zm79}~rSⒽhj#d){i6#o~+Wl1u{K˥䠼H{JQKmb|f5 IALIHv+GJ7BvӥIh&6FX>0wxpV[8QȄQh M-JkGIAqZMiP{ "mʃUM|yCBM*iűZ#ASTEǜf*I /Ldm2,D:#v P LNk- )U̵?Ж8k< |k_]]qo@vEgY>e #Ysqjи~8= Fqlb 3,h,ePdTuT]ZίԠw0)/s2,9s+LGԤ*b4w%U) dP;Q5Oe2,g*T8;;nK]FdZ*^T&ޡf ,e%Ar^XV$!V0. 
6;Cha;<|k1AlQ 'Q2`J>N-RFZ_[JL9 n\2R2fAA LϔTKyR+$.֚-/ de&BJ$,M^eRBv.Q'x}O_yv&·gkeCq‚ -Jh`Y,222A!F[ֆvO9tl/b3VIh-!F親0"XKڑē#EHﯨ=TyF, 4jvedBT0;+mQz se'*N+9$9w9RkMgV|iʬ#r焏pHlhs^2c uH*VКVV:]3GL*ĈeK[BA"ra !D7\:.u}qxk 'Ҥ韅 ŷk] [ NJ2IOos74R5㔻NGd0^^?Nm5-aϸH#n(&8%;: C8rqOG QzvmQ,9Y% %R9B!u 2nXaOzqYN JIsvb }hSҧ5 o5(trr<<^HQ+(lt59g]CyTi\tmuEy(^Ө7mҧZ_?W~h.|{59}j}&L z!̅~|n^%H5~* >8{EoriƷ(ֶ.5#6wthy!i0l`ye/z|f7shqm꼓Z]W΅nu\i#aI(G_S5Uc~ī1C5P4@<moxpKNeLA+bZ7TL**OL/WwyNJׯu߾~sʅ9}߯N-q@K:Y"]鿽y 47o ߴfg/|vlkfY- #ޖ Gf% 퓾.;P#+gBVăl$nWn# 2Wi-5уZe:r.xu/,Fzjg\F{'@;^;)kDec%EFgga6&GM3@v{:r_Z[0-OmN]vvם^^U]DZ۹uPo~?_0%B6L/GA$+c@-VG4S=k?A'%MBL1:Kaʰ3sHbʖ-'qQ"zrjM te\bDv]-ӤĹ&&}=?aA6/F%c|ԑcOt5)s.xVKB@k&0G!wWwIy624nRꥊiBzHdʧ@XY0^g^Lͣ)rC'xzm:uդZF>}ɋ<އ-VE+A 3I/d5}ʯy_;`<jzˏtIV`{on-N U4OëBbHJȌ'׋~ f(Kֻn\>x6k0XU cF|5ˮT ̣08aS9aS;a9aP*B1-26',2 k-`AKe!nMX'%.Ų5uf*cVXKX[cL{M 4ܶך8q7Qh -Ssҝ43/']qJ=}~p6Ff[s"YjdIG>{a؇FPBdI@貟[ CY$Q#X[-';%if{tL‚Myu5h֨t8F4q G`RpStd؎ڵR{Ce61x9 Qп6*f"P h:&kYC鬄pз,SumDӺۏ T1P< W15q/GUmz woƗ>x7~ )zug6Yz3=3C$@0ńZ =ji]qy߹v(ٸJ<"iNfGL1>Y J愥E8Œ%:S}j:\vjgPKkRIJS!:x1&霹 Ѡe\IhȱY;3 `\x^р8yT6ֈY+^2`P=@+Pq(qEڥR{PԪs 4H. ,eUپU *٤l \iaHh; ʂ:VT\T]Z VLH8!OF{M5NFzdwdyW'{dV\FL ?&q ;`"EJL0,W^*j?. ʞAs^q1[967kw>z-fa,}ՙuQg%:lEfÁq`*I 6=TwKm?W/|_ A^nа(k.Qnc!&; 2 ? ,'Pj5:jbKc {Agh$0?L\ä}.3Ǔ}Ec;fۡ稵"ƒ"WEZW$SJqHl(E١UV}";z"^aS!C4Gxfqե˴/)Wf<[Bf8AX@.:0(4R,R]WV:fG.\q do> Oeo=iCK=s= toSZ/LWsGWQpE3$d ɚB0%E+*:Z拂R;zɨwF*WMW_{/;F5F]v.Qݨne7uٍ?ex> #tO.NW_ׯ(m=k#އ?`?&P;Èjo91UD:sZ,Nd>:_ٔ4 -}ڒHKXTΙXjay*uJiJ#W(+%m) Ì7ܦ=r#&.2j4Gw^i/Tİ5q6,=x'dU)$D*{L~_k)j'%۬HSI=w(BfCD"KYI wmm-'UypʦNś9.cr(_3C IѠDyǵ+h`?t}a@xn)*2,% E.h[y#8ŸZ1DpJ j1rv j<_4Z-&nQT] mGL e8)1) 5Ra9Bg. fQbEQjBHΑ+S=@=kB2N AR-lB*ٍtbXX3N ̀G5Eynj6ivWoj6t:2%Bb,N r]7&ʸ2+{y\lwqa[/IvDDTxKF"g7b,lŸ@,jw =h}6N94g4j:e'BG#e'5MhBds* xT;-(!= &D%L{#b~`]H"FuTbƨ_x26})8iaD"cnNpx?">SJ{?D#Q\jc(-ZUFmset ̈́RH#…CଶqTƌ]eh jo1rR\ y$CfJ&-WMsiz["}w1}ZhqYwiڧ ,6{C"@{1.3QB%yj798fߏ.,G9Nn5< -Җ;?t' atz>s8 | C%vvEΕrf:^YDf&! 
z>!.uƏx3 4-տzfo~TQ tg~V8Sq^>q=Joը6ٌ03m?'?5^//m*fQ6SC̥:][nnvpd4~^6S!wgBzFzGH}ða`˄6= myOy.>MoV ]ߎ9,L QWo2q;jJFɹ|]<2lۃG;5@Ѩ[?~v:}cYݺ4KNm(3,+#\r]w]4TU薩JS#?@!o?L|w?{YF&7o!˙ ۻZCq\%%g-NKMg524ه|Qq_?h.^M}MJܑ;1![ēl$z3wn#1D@}[V)chrR"Ko1`#=a \Fz~; x@;INYiJYtQ{U1K; hGKY&"ra)rSթ806K80u)ЛarνIg?GԪmev*y)G?>/~pIUH]*VPp΢a]-[senUϝU=[V.cP4'k\RI0ux!qD@u`J' P!rI+PMqW@p&e6Gb% ҙ[[MBL\O`yqMTz?U58{<4|nO}M<_BGF.48ރю VH YSAxo}n6TOv&;η0xTwƬ٢ 7K S&XWS)+k=vqob;ą1F8"|.O\[e7B~}ϼ֣/zhoFXW];lSvT®QFǜ-XD; DЄKQih6Gg}e z͍ht۸ w߼ki62bʴ?qWu,^;Y1yZt `0wH$3k,=d՝eWxk޸kYeA-ۦ)?X/d*nĞp'^߻c 7C\\MSM%wU*$TF ̊hX1hz߇ww&H\kJ=!&jiVo$Ѩ\H#U!2ov6]}2=9jcsĿu+TRRŏS (-D b$*̓ pvP~h'WX1헢'aSAe"${O$wې|)E600ƔuxJ-vlp*!'S {7}fgV%mvG.z?ƃYuϐ ߧIn[WmrO5qW[Vrc|n-oߣ獖a1ootSϮ/R];iߘ5kjotg#Ww(FΞ' g}ui6S:8J>X R#?^-WkW&9wUPѦΐ \iL`d2{.`j*GA0 vѣ @D y!2$w0mu,F\p<qQ^~k~ތbyz?Eybt_-8-Q1 r+,)R5O4 ;i0<{|Bπ<XAXkP!!g[:%$oD^Ɍ;T)ξsZ\sE(t7GiD{*OdB?̬E:rk w."U1RqCFHVD4&E)㙟U PgBg<{HBh4-D-=3ӫ K\UtoFE?b~9'Yvrje_}:id4n+E=_[ T=N6y*ȴ3O]{(nQܒV7$FbC/SYVaXx: X=+H!Br~b46?ϩ~,wKdyq]:jZ0M2fivn\W&#3ju_ dz ʤtdmGv2(;?Xrz<-q>ㅽ|8kx9ۢKBu=܋.OSQU82 P;.r䙮j1܉6k$u,+eug&5Cu&]tP6OfXUO杲Gr&pqf66h} _A >H P`+.&UDZH|d ?kZE Fקu#uSxN@Lˠli+ O2v&Vhi998|jQI^@Ϋ*X7,0TP'U8F/m `TL` #Ofܼo+5u@c"a2)\%HVUZjW{v>d6][oǎ+?`R]b?ŞF]Xbݢ8˚44Ԓf6[fW_}dBM,c8=+=[BZoiEE3HN-jS5mꚸඥM]kRz3{m@^<ғ.#_'$9w}4-j.]r꡺ף,>{~ȍ>,cRiQkp&z`1ZDM>bY^='%Y~HH U5S'B©\ReP6fK 뽈Oܞy㸿Fp+\b+̽3m $`FB(TYkS18Hvmm:֒ m5HW=>mhn %j7~Fߏηd;0bZ93r2v-|]ه##*s畇s Ll ꕖ7}A'Q!OU!)geK` Y"Aob8.JX1*hW:FetjR"ߐ#ekI̽:'@oEl7'7s>8gSLgZ_d$|Z) .9MK?}UNY D+ӥU0ѫ{"zOĬ3r rA|1SiV&ˉ2:K\jS%;V~\gUCUV$7L.X\$!!Aйh%tWj)3_M|1]enW Xmfy2ay٪z=E=;۩^u 6읏Hti}jtP T/Z'j9ʵu-͉tdƴ"bl!C>$"~^ !PеUg[M5ܦ l9hOkMb`j2Y`@9:zFW*B6 *9.c4zT.)k_c֚8 ;!(o<k0\bŢ)+ h|]M>Ƀmi1'7lTi!*2Y'&694dJVL5㠵 !9jbWF1U{}/qV[<Ӌ :IV[SL<{|!ikH`[J4i Mʡȫ,j>l \FM&^!\0n U7l iQUr`pz{`AZmj-pդ}jpդD7+`kbp[LLYX$N7}u,F );7%F"ٺI>6鰇VW~s{a`loh[`I648LӞ@]< ,p򫥵ꫥDApդZpWDakઉ+Ώ&-M&\B24GvYY<7ߴwMNxwYL~.j8t`dۅccY>KwvohKJhha4lF~񻽅R 7-gJKTM3Ę ܬ ;!0y $tzzv5S]޽od[o޾y Wؿ}}r#E{g6zZ.n' 01m?(ȜSb ,G6r@99AܑJä~`/&w2}h`?UZaض]7kގ[;~yss,®7B?b_I~Nki5Iwk's 
.3c/!v)Pp l5!M\ BEZ67) 5i!7Rw<}:î`urKieH~e<4D{نhq.h=)CTxyfTKh0ж~:u&eaU]Z&hY„Īdkjs߅{g9 Fh'&'k+pzo_|ܻ_|^GY 7ͺ@]xJ[]4wHƅ;8:B!mQ.d:UcU0x5jsy5ṽl^ 2NE+.yE:Ut*!LB!gS^2_&10}n,y+xT9;;jzwOy!L呜ޝӉs;QOi{y)s**+,N\~^gGM9$LYڇ]U[ xĆ]ќSH9_wsdv\h`m*%Z5`c(31M5+= Dg)vxrxJsL.OPRY-աE-4l oPEvb5dT83C#!cz/T’=jo6T?xŸI?{lq֥6ruϧ'Yko|vij'yԕfO.?._=QɋIއ|jWέp:xk !)rՈ-XP~F qw9%kJK5\b4QlZJMIy2 RQ#)"qq٨5BR}#co 7a'+ڳ]\6fdwzr⺺8ʙ= xttGl<0qYgi1҉֣WscRZlCɭdQ6%cMMqo䠐 vʇj LՈY,ڽiQU=gs>QwZ'YBѲ6!Wc:x3&h3U +VtG ! 3 ^tL5UvN`T,*SnЛ8ps0nOߗ~<uψD>mI5x1&R"ZbwZOʈj}i&|P+"J9i[Lm8JX;M}I IC6)ݲ7qH=`yIpq͌kbiCqf.FNuu"bv5xPJ|He7{ӎ j5<2ܟlF1mG: n~|Gg 87fF|{nZIג5lZ-uv4l )S mvJ{ڔ@A)~ߙ{N-I}PKa_*|1 &mс\]ƘSBLfJJ&?)JfyD:2kcT@2p"R'kE"d,rPеUfnPFyD;ꅪ.Mɔ"#?><ϼjm^&C$[%f=Ktr&XM |6o((Vq*ru@Gdt !\W؜Lr]]>/&+OZW!džm@d5D^ȭrNp؉# d!DDq'e$' _Y530d,*qi8-rJZ`&Jomiy]21b5UѤA%阜5K璑i&P _Cz4L_!`s@ݪ \5F"hK;(4~TʅRz~כOGj曮bF?韫8iG-']OzSe%LJߎ'o~韣xRM~,Ӯ:YcÎx4j;XQ<<*V ~?v|R&k;8- V*svYmމ1; > _G9e8En[(9[\;V'2nìևA|r1-~Nc22BLS,N@dr䵽6 ^.ʌZ{q4k]%GiyeQ3ͮM/o5uEbvKRJJI=:? ce~W{'@%wIwW /7Uq6nK\*J)KRT53|4E$ʞD9 1~ѷ.ښa+\!ƕ}̣wh٪`s5w쭭2զmR&V{lHnX0V/g"czD`JPԙ◆ϗ3>7o]s|%' 6x|1O/xR[*QS-8-~09E4_}ͷo_9L}/8/q:jA C^{jknZreߡ]%rK?\o7#e"T&DliGA%!UmIyG6l6Ľ|$zq'QnZ)KII {/-s s ԇ q)!84/䔕4E1Z'YQH@o8"]fSPN'tfq~99tb]0)c7;kq<>lyv?Z0g# -(l"_|:yF +NBi "V9=X9X2[e] Vk>vkT԰$ur)\ІH9,Aÿ3&H>'Rdi׊vdPPn4XH`u>hJ@ WÎClPkPl1Qwd}]F7oJ{HL() b$DHŕ0fi2t_##!z4{;cIGR <SP9AZ)<1ΠG(@TԽ erNB%(atyk"FLqMD$4BhS8 =z]n|qr~m}Z<ʰEx~(i=lN;|>g :za4;&$a p}\DQBԱDC}21F}t$ppw;#{.YPL+RE$Axd<{0۝<IYVUf8_)=|" ކ䃥Ni?c9j5xϙ :3;wݫ.uJȭCˉ@h{չzvb>eSK,]?.W9Q)nhVbK6iwvިyUЇj^+]W}EwXuwTlX$I'x-ߖt)۫2f'ͮz7ϒz|sMu?V1ZqQ`xIZJ\4g"NM}a3kN<Y7wj{KzQ:^tZ30j 2Tr`Y=t^y0`5b.j0p%RKHc5A)3892\w*GQF8w#ә8w [qQ^ҏ3D|X]KsXvqeN>gl(cg_qZb A(1VXjS&#O4 ;i0r'#@oxv4<k x9rB4m (Q$y$%.5Btby2\~ 0l9^JpcSAU#s$ Lƙ`\OU T_ݭ$:E_-MhѲfw] ^kn ]ȎRFL|>R;[zmh2"Mlp6;8MФ*ٱN r8Ogvln7 X8ܻG{oP&v-Bl&ݺ0#;ߡbQdXcX9""PC|] =,za0W9 o%`~~be]E0B8VUHnmln&`vKI/6ö ]6&rmqKE*mdL]Q>\|~θJr܌_&xI;/}_\EZ~owsJfORIyBZ}N:?19A\w,Gic n?6 >H%_trʰLfmf 5/ҡuΦ{NkY.}k9"o<;}(Sr ,z%ŤJCR$GtCŵ ;y:;Oώ: 
aӄe'N[pd-*jg"xkȨӚÕO-.p27z>EڙnI~g Їz).9J9}‹Ix!TP`CɋarR:WRӞ+dD0M:SaSğg|ymbEj"a2)T%HV+8<ZjFR%KjD +Ƣ<[O<'@9 nILK:%cO^02^ *T=I1.5.':_F0c3qvZC:IYgu#Ȑc,^mo̞/Hvc/0+Ƈ3Lq̱MJLK!˕ j]Qm$"*Nb] }T^Q{Fdd$AbON%'2Rd*mdNb!k֔z.hCL$]T/ē&H|2 (Ӯc֙8;@k_9ܕ jrw?qV efcM1d NEwܬw.nu}s3#0=: {6!F_$ܷYa dHDJι+)2)xП=xQڗx~S'sw( O~=vnhyQ2''y,VVR.6T\Q1ryEBb5b8ȕ^?iܼ}F- vxB$=zZǍm}{o%Q(6 us3!-Es岮r PxPtK-](03/ف03v>cY;PͫSBN~I;ƐbD%u4~Qm\O (hsң%2\@I 1jMº\]`A u&݇j&Afcӵ,cOCZ~9mGn8oob _~z\\J܌ΨV(h<4'%F2RǍPh"sr+̈́RHIdq2Nj"JtiQ4(CS0$@k3qv$ Ar8+Պh .NKyAD]mRyukWAi(~P6& 6gčjAWvd=*:4SNq1ʹJ&bI3tXR*(gsb~θ6*nZ!B@y)dP$ɂws_cƦ־3&ǟ|0?|@äEYnqidv= 笋$+C%$JJ.}ׯXOpEIh #Jq6tLU*51G,&:A<%~>!E=YfVέ)w5 󡴗yW2?{Wȑ vIyD^2]{yBIT67(QGXH wKJfEfFD}`.VцDeDsY$`y8 @k CErY#TcFxI.Rg:EKbLx- UĶ~-rK8t<&#׫,Ui)Jr]YcYJ"P=LaSOOPz"ڪ>T.{Cz""Zhz@ R䚃)*N]]e*eW5+M2JBr%2LﻺT+TW +$X1s0* 2rURN]Fu.nˡ**pɃ0Y(϶^Tǽmp#}EE N8,) ٗj}~Kan0r3{Z=%0*͞9,9Yr=ze~0*+PUV}WWJ:u g``U&ءL"2t5+RPp|uNg-bIQN;I-ڭ&w&i)qP0MȸQK~r:s-#ͥggW)QG, ɵT "fJiYy.Z٨C`#*{ާx\:ZFKEJQUbο')q_qJ=Jf: |ӖN{ ;ޘӜܽe[<ck&qãKO[FHń 3Hx[cMuEuXx%βQF/|BIDL` /D?jud i|&i ,}rO=h^my " YQhTUv}v.=k S]{"$@Nx:9͍ jV᥊(Ebh:MB (w[ %QMnpl^FΆ+h>Zi5 7ٲWjx1Y Udzũ?Lxt|y(2k[hsf\?fL]?7zebG8\]TWO9jjke8pwG8uiWm`楑!7loquu7(m֒:ꚉvEwlXcCR]8XicH5U7)YR7{g-BpVGmͮ<{&PI8WǔvN ZQ$ċ54hUx1q]6 &eF2 ,Z#ՁHO=gb/&Ci\q.|M̐'Ÿը}qpdAޡO羽t-;ZӚ0-S e:*Fg\1zTN#D#d^ٸS bxŠ]()HpP#RkYBp;B6DGHQRG1n▒H$.}b" J!AL PU #qo;F5r6Usu~ 9iv#4C$7t Bewe}}jX~;N'VEogE0.q4:F?/uKT2c?;+??Liaei0A(%njc87f8Oqs[SK&h0\{9sA}MM2""ܺ)40()ys}{9-;_@}/!sß_+ X6wYC%?e?ǡwZ q[%rf8}ETQ>,Coq^ݼFˡw՘?rϤ :bφLx8_&ʤzʚkG!7%˛gGF;Tۻfbܯ:7(jt/4y/#Uq$XjBED8X?D |ZPC1k0$H z쨾C+[}G؎,<(WE{{< :hF{.%1)#,uF딌<ke&PV; TxF}c\k\ "8Ot;F0e5r6%*>y"daY.ujdȁU٪7fOmOZvZƾ `ˆ/@Lg^?y-uz+F'ɔ <4-G.fŊRrPH{JQKmb\ᤠH$$:nJ6E'ҡ1P)c9wxpV9Z!2aB)eYi +P݅ 쏤8PAKNKԩyAD^mRy4޹R*6K ڄPY> ՂH Jvd=*:4SNq1ʹJ&bI3b1לEqRx"ڨ%:s IH'r$Id1crNHƦ縷}E~EګFh߯ me gP|&w`.]Ҡ'>᜵Ve1: d8>xtJv_sxiAIh #Jq6tL9AEi#E v v5/yu.25*P6Vc>r~^W>$Ů7g_?D hhZD%fur14(0Q&\ igRi6r:᭶&H]Ҟ-U1%(TY/ ǹ`j2|uVuߦ5e*q}tW]TJ| &cJzI" L`R w%F!:}QYb "# 4ΑZZueMkDC۲,7h/q>ˀ+BZ! 
9gLaƵj)$X`T9F$㑅H>HԔq*c FрG!e̍;[ [#gCKÏJtHo۷,nޜPVXѤi>7)cV/MJ3Wƌ !~40uXPĘ!*HʴH(Qc8^8B6C1(G@ u'p&!4R!U3Bm ,Rg2DOR ^J3p8)8wK9 pk+5r6vv6%,ɻ ,ó݊w@)J jpb<˼)^|* a湔#m&R#SgSxFQAy rc<0xr;ֳtk<P>hXQKu;u&U@0FEg^C& qV8OӾ?Tڱ9Ǔ?NrʃP̱7h "' LV{m#s0tSUЗuZ86CCwu/PԶf׳wظlNdKx &pfp;ٸ:LGFk,2K"P2` Q W*6魭Ӗ~ o,ݾߗ*1"*OF~. 4{1'Ml)$qIg?-`.+`U]ʱy1};mϋ_QLcHn_}͓?dO_gݟ?V~ST\ѻ`pvZiY gz\N۞^ ]?Ԧ՗_Gda 7Rh募S |JN'97z7i6\yTOtcJߦP,io2sSǟy=#;sbDSM(f4׾/w:PLDOE_|0uͥQLH)8!U+9rK1NG E`6S_:B}="^-V&mUՖ%z0 { rzKg!-:UmGSx_-97_ެ?dG eէT%}1:L|{M" `v.TzdZ7) p4_bnP\Nn-+xk\:[F ­b&2쐧p}ȇ1f2ޥvmmdD7=(߯nWNMqr8L. \o?w*WgLL|$*A˭[H 0.knp_73IuTOb6 lGՂI1Ta|QŽNB" Ig"T\#9A<^/xAUk8bCa-'IlJ38 H8Pe'(%;N+<8flOA[~02BZ!̧N{ 4\s)c9ʩAknJŨ5]; +7‭i-m=3{JNBs4 q[#gG/jÃQƳlX^kP\3A?4TJtZ}.ytxq*Ek1ëi2W,VI߂}Z~+^yvʟA(~g?Mfa*zpb8E!tv#׿B&nT+LWTL o)=i,sVJ%rfx=;jd2p!I`6 ½Nht8n[{IM ܛWug6Uu|yi˪O;4 ͪd:M20/Yfm:.])rTG*7mCh<TȜQ+r28lCvZ>lyȾjGHօMRv-7˓, -)V^a&A|=ѭs?D=W%]6Zʄ2a^!X&HSE57V"FMsFV;|vNķx4&98F Y\ %D *[\%Qigq-.LQ),-;),S62 =a[͒g8r|A \8./V xImㅲ`z~˸4a-xo> \ʾ{|RU,g!N6~ {mu:.ƮPɳh|?N©(]Ƈcd8ʙr*WX܆S¢%!bxDNT>U'{`pRj5;LTQ >(ݹJm=+!)ː+mH l>َs&쇅 )1$X6o!)j(Q%4{jWOz•^8I@_7ts8WI݊/#2)hIJ;~?n{ʸ*w=]vZ38lC}Vm{ h&Xf`: u[ e7YxnEI%k0Sh,*ĶnKքD֎]/0n3[!r#i\w.dlI@s1Yl*г:+Z7y [j[ۺnZ[=7Xr~nțg"o o%,;q#2G4;0¬sojU͝ovvkƯ>Zo@,D-aZO#y46i[ JӀm4Ln^{4g$\Fn)TrhQ(k%GaCY$XpA0і)4 >xLΝN1G:z=tvEmSrTX+D`aAc )J[#Fif梕i/riC+N0(0xY@" τR :Ѓ4%x j"J#X [v[=q~TAxU#h㱢 x\$ xnV|x [w*ZCL: rjpI$ DzP[&!ap) ɁK ViǬQFYJy>%Kȳy1rvfu[i9x8+ V)c^ j=*&ud%.`/0L0B:fU̶SM|a…ujQXőp*밊a,%V^ )d4&jci[踨ܛMnc1 $`+Dps`x.jťEQ/RLhM &Hr ZZQ "sy.(BTr0bmCJ`SyEsh na˜K]:>]=\zgH1д)@PI,-`.eZ !NZd5h5Z`(3*h2:ڀmT 5T!k\a3g,Hyt^]ZdYT衐JB6pxksY>4`[%^g̤\%DN@4-a[3t @ߑ%xb-y2 )4d1:B_ XO88nS8710.4"ĢgSX6uۖ}my7I@BNQDf"0!f4X &'&(ӚQа[0(8PG/鱅ϨpR-יOvKk>*AV$I&nr#fJ0*(YMMxs}P%=y)6%؉)""Zy9sQ (:XA"9뙗AqP[0¼gҴ{hjR;ԉ cJ+4yD C'@b\kjt#;"E RrHd 3â*V2 0B'LbXc>aT6tu=AFs J`k1N062hVv??̓"Ү E EO!֧ 0 &T>g 3ZZ)fefiRIH4LjRA4)g,URrk { R^\3 $YiNFƅo:? Pw H[ZWd?aLJIa:޼vIҏi$Ψ>|wr0aFLR v!F"r2B'RP;1Id z:+U?@0Ui`?P& t̅*M\ugtp14G=|o&! 
Xsp +lWr)tv6U^ʰRh)N/NN`ED1~"4ͦ6s 0^A"LwT͓k >.?;?7^\O [Ĥ>*g WR=_Hg0zl{PHڑ_?RnR; 06$0Oi)>n cՎJQI֍Z7W :IPHq8v5$6N N\P*~Q-_?"wΜy.K*g!9ARu|(ӝt\ENտnQUB#LUNbwP  $/^/.˻/^~{x۟_9x.:#A{}>W>bM UCSŶ-(nr0nԔkquܤYu}E V?_DPsAC`t*H_v;HH&LJu RHx9S(98無]^:jP ,Op‘h)+lPN`dz` &GdcV*z ?P־iQ4':Y|?ɾQf(o0ZՆNPNN`ND dGUf86r,DkMYNr/X5jQ1G >2J#"(Xz]X+\$JGƦnǭJc2~]6N<m1*(O>ͶTm!ZjdkۍFR`1 2ZcJPݟ`UI]Rt)QVF[: `y *_|t"m !oFp26p50CKr.i`ث!zC-Mʬ~^ndFVFVFYIU'[kՈ c2NͺZ%ѩU@U{V:Z&W+VԙIɐEj#YtkrQBNVcis$ŅކXχdhr^q(#ΌgFN#0Y !ecT Y*rn&Se\)]QP(eܸ64fg69&Q,XpJIq6 b꽶;Pjv1uvCy:"̼8u|lcNu8ިsu "$dk ɠ P|Kyűa83a| vUV_Vz|Ю >eu"Pzնݾt4!.bܒXg#)ڎUF:{tjV$\L"*FiF:q`࣮KsaK0Syӵ\V08%2 A@E[UbHEWUtcLv$ԵPdɻRHR,)\Hs̜(+.dggQ&(1:#(%̨Jep5sU*fĖ$M. Ksgy"z':͸=ж64aW E\-Z@Ja+P-QEi#,ȿb|X/=[0)n2 x$~wLs/41ǎmG|Nc(Gpq~vR;LH *ޑ3*LyShٿwӍ7Ү||at=lns?_rg|#.Mq.K u %IR̒%ruVqƜd=ED„qz}_mȿ6ٚ?}({{姓&XͰO>hC9]\ڸY?xX\5{o~6?z|G^4췙U}Jik־A\rkW?t~z%ӮjGm}]vk;q9Βe$+g<:V˓r)ڟhLMpG4bښ¡hS"E/Z1 DWxTM7^tV(8\RYʊ2&x]0lr>]&S1J[BaͳzquM#z)mYomulLJkǢmW=` I=-lI~f -oP_ۯ{NÄv4jpd"p骡4 a=|z~Î\+*j0Pſ+oO?|8_^l^n|.РAddê{oUh;|}ȑO' Up_i߯҈N[V[%#FZ"H|^,OK'O.u0M /`r#߳uؒĸqc6&k[~=>;߶ֶ͗'帎r?iTF~F!Z5W5 bLv!סJҲVY?l=Qe+?Vvq[?23exL1hI)xLq(qd{D&LЦqBt%%X'CW .LZNW 5 +&CW S+Ak;]5t U<'EW^qhUC ~WHW١] `VQW s?GW -骡$qJ`<jp ]޵)]58k+ke] `gde ] Z5vj(qN_#]9TrUTJzeNW tz)M `]5n2tz3vVLW<] `o 6v2OZ}2P<ѕeY6z,?[nwy֏mܱV|,i5 }:]@|?W;N6gZLNGڠU,~H@y0pԡz5L}_['Ĺ$ 6-g~}N>/ݷzO7l ן6h׾5kr$;(@sB76%i.F8L[b O u-$ni,yX~m WZڹ|z0pҀ;Oz9_K/]4?$`FsrJ:87eG$o`" hMOqn8ܜA K0_ Y360U@c%"ޣ).9>4szmPDU}5VS1@%BKNbt?8V N7Y>WaUtՀiXcoцE{/ Vgw5;O6J_̞G 0^pmxt 8enď47zimy3AGXsIC`%ҙW8Ԣ(l\0[ߛWvfϮW*:#sۛ"^ KabUZ5Ø>l. :axZ\ѺЕJy 8h4,  UY)Zv/ *NTvMGG}=&ܚL/[mV뫏ڼꚫٞ; eW+=U|@*m>emVZBB/IuŨDۨS(?n? 
TLN %ѓ:wnp|E~ǧːg_?f5gau܏gk=^#є_ΖQ;X?;&7O=vϙS2o?-ǧvexArem,KDM1"d.l,A;ʌ]͹ƨ9emPrr(k5lŠ5Pz3c783c?ҍ EX\\x]ŧispq>OJ'7!,?.mb^&%%R2OGqy}0>\|4 - -hSdM&;U Q5O#z4落canq kͬ=+}9袤cAk,|t C!L*&cW>d]H$(Cm$NXpM.*Y(Ɋp?+e$..| |Nꗿ\1&2̈zfęRX=k !\v $KE.r.M5Z,{JʈB)ƅ 5J 31$NI2i0&5/Ό C‹a۹4-JE̋0̋w9 Dx5EQ5dPJrHI{lD| ZqV_Vzw6p>lxSJ`stF~c➺'{J7eՏ/Th h~w6kZ$9Da0Td( U ک 2d_%]x- #* J=,Td#XJU3o5TоDRbBdW\Y9ƾOƄ@PP''(NF5{{npXxŦ#HC/mH C{޶l R@qo - ^Ø, EalS0h-Μ9sg>uh[߾k"M6.dmx-=9:.w5 THm`0N(G;A1D q!8Tc͸.x Q uH~uW:e)E*#R"%1dNF03,*i%qp"`ȇpwrC= c_u Yw( (qϭA ±1 bʠiVZ$?evcM#\Fqb YnjF]DZIH4LjR4)XԻJ.U%vYN3ky^-&ƅo,ſe8vfzG:|)WT e6z9޽y7yJZVJqDu$Hpf烡0 v0mH:"a=@H D0NS3E.Ẁ^PսS  ]skB 3nWiKX;BFE"{q\7cワJ'$?ĔDlQ_U%::.&K9NSφE 65{)mx49s,yLOVl")<6T͓j ~*w~*oΪ kH.08-֖ڮ)Gx<+~w (##1~HMÐaؖ o>%}LNhxY/tsp1mӬqTZOiԦbLj"C#i`,1/B6Y"}xS"%?pKBTvKq㋑~PtY[]J=pR?h t M<`Ӌe=WuQBkP1LUN}1hq;s ߀H~O߷/NN^~{:y۟^97x/5A!p _vZꡩb-(nsՂ0mNa8—$Dd(t?( /eI&;iY.6RU>&$#_DQj3JJ" u*1bJJ'9a(3!k/DSHlK^b_p h 6(B020bPrGAdkV*z J(l:t0OŁ=qOutLUʔq_ʤ lU4n;Y[֍ҿj~ן79\!rAn)1)r=$s|̭3GJEXh*0E@heDBNN)lfQp@G @:m'’g2D'Hk0$DpCLDK9 Q Ƶ5png&NLJroyfNnucs |Iy<5xlO邜/\(5 u[-91,B1"-) ҁ=1"7򶂼%ٔ $MnouR1opX'o$&X<:Q `M(>>NT|z NWS'Kܛ:f[VTҥ*.W2:? 
rY̙C:<tAuzeH~Uϼ%ه,JQG:n]kv6|)zyjC4A0ɵ )I7\ 3f4j|DHbaJʻzn}07}!pW뻳"dh?EfLM7Z y*p$rM&>e L=aJBcB`iIef-%i:Z K4h1АGsj/t~IMґh(x!\1lpNqG VSogK[EnNKcT` 9a3cj4U aU8Rn+帓Fv+t,, #M B@Q0Q(8>d"-S>"d]J* ;dXnui^c3#J.R0v#D!CK-:cZa,N5cgmI 9Zr;Afxg2LW‚ᵅ<sk v yX@p*L}^ wb[,1V@sqڶ[0<8c1{(X'bf}["| o,yeG,xy'F2 {1KڮW=DmU􀬶-C,'.+C4FOiHIUQ8FKAq8F;B k<UX-zg՘IFMFS,n PjxvX8&SF"2D j3>:o,x´ !*خ[rPP:B䒲L*u^o (nGľ8śp1e+=U udn|%w |pff ~4Z:E|-eA E VˑCY`ߏ5<"[ |O4`泍ag aiq|ǪRB;6kY׺xg [J81bZɑcP D0&&\9ce 7U=#aR/1",DDr1"b:@ #(H8RSݲ5pvtZt<)5g\*`?Nfu\|$\7Q#[ ٛ Ǣ*zޛֿPhUKq}ȓw~=6_vt<py ^}q7хQǧeijU`Q^uWhS[}6dV\f .߼| $Upl"5Lπޞ&o]] UXb0oa(6ieh ]vQ`c=76Ύ4t ײ7Nhޙ)Puxʁ>amKI,Hyp.xiyqx>YT]m7/˕ ~֢bKEK*N,N]xuK!);-\ * rfFY}>8DY|1:{OE5I^tz̋ fa1O&p Kk{@%C3xΖBP}Co5E:c'_-犾Thd( S)5;3V{fQFE+' ƥN*m:TAw'L2(\hiSXS<{jnѣJIWp_'d4e'16uqUZSK>]o!(^}Ɂ*(JNYŴ5ʗ:MCHrWcw痩>yu"ݘ~_w_g>g?· &xr>`{Kݓ2fڒ_~~7k}ZknVM嫺iZbK59+n_}2ztЧ|}ΌG|]z޾&nˋe>0còu]ǓwGÏg~?9=[~A-N.g8仟̋_;*URNNZKS9|s䓉<Б M1.vD'WC*)#yk;DA]wA#ll wRm]Ӫj"e/iMm+~qL 1 X@k%XA/*>;F)u:,ҠcO&gsί4zd&K/<_󻳓;ze. s/=q~W#ʱnm3BA!2#N+SJӜ#Z_RVQHwjwݮ꿺J ͟=@{-|m֞2-IfL+SI]:H݇HNLf;+,v9s6Yil)3U2Sr)Rʖἓ`MrdVZeeNQA;tNIP|T$ Q`шAD!F03yBi"nu9YW~$Z"U֟ʯ'-S/uh->|~b"P|o"aR~ꐺcF)3yr%g_ 93LLIU٪d+:IsSI(sǢ|FSʰ@Q{m"zT6U8/_P>-!tS,CXz%c6DiR X0=-|HkANPGи:zfb1Řp2)lz1a`K%, k8&H%[(.s] ǚٯ:֞軧6e[uɠ6@$h^>qhzMV2!ZXs1C):vǾ*cF?h&{6g#B"޿bBv㈩_ AO~aK)¢OL4CBkg 2(t - d2("Pv}I*tvRjYQֻ F%{i E<U>09GyWǗWe \;z$)zfyhrOR+MbpfyC솺_w1<$é> -km6uk;c Έh}~5d[ni2 WeIg$sP1ȗWz’0Bq˯[ۂ[^=z#[7[s( PEΨ sFo|L9%X%:ڑȎr(%GAGl Qkr:\Ϙmԡ-d+NLJ6 s(+0J}3l_i#Wl`ચ PZ8vVkp ikRE&Y•E wkdIίÉcRBMM?݋&yL=q[{޽uiw̟pNJADV;ft[vyj,oUoQ*Kq#vWru9i~]׺p1ZBÁ9;mA燺y@wbH;Io#..LO[~t%9_=Ǭ7y1=-Ж`k J1nટ􏟾ud +C%R]I\5csCarZPcgrJej{w KNq:gYWR Wp\mu f<<>pU482( U5PZkqpVja\=CRÁjsWlnpUT3+B!jWlPZZmy•kӖءBП\YuK>|T9y5<k} {=,E{)Bv]JBTX =%.ݷGjo|_}<NRIV :^;WVϬF!v.cף[۱>m5O]ٛY*_ do~r_}#-\Oo~=×+r]\Onzd})ߓwNiT|1IZ-2 vZ19MദM4;Tp N: NSi*8M43~O1Voaj*8SqSiMu*n*8M4Tp NcdڦӸeSi*8Mq˦Tp NSi*8M4sW{?`|Χ9a..:: o\)[:Nt Օ,(i$FXb(CƳ-V".:u)E*h8ubj(6c,sKZAs]Yua.P$b[ Z%x sy aGGgdgjb9m_e/'˾k/"ܹ躖̻^.yq%/)NjMPR̋kO^H]?rqOy糥ko?AQ atet•Į, 
-V$&btc9ؠȨ))D'֦hɤ$-IK0H ݢ*a`1,t  Wkz_3 3AfgqQw͟WZẌـLhU.d'A샐 [*ȷ=@@AMQcИ d s;a]ts؁{0sGp~(XP{0I ڋk2MM`ӱ@fЖl*KpHg  *IESLjP<,\K!sf(YtT0֤, 6jNPIu6~z̹{R\)8N?lr`D "YGWJ8+uRTKIcfʦ{+׬ K V 27΄άڵT )DXtҨlcd& h7c#3~DMGg"49l00.Bņwy<Γ1)TR-UY"'fP9(Ȟ)5\| \ vxjDg>ή~w6 y __F1߱w\~OJ; }6%>L5SkW+k}oևdD YM;r/Ygra'U@RO&;K0{quϼ}Px` 'j$=TYB }`>X(\f (ҰUᑾ#Ps&4b"/SoRT(mvV(R)޾i+vw.=V/3WUꫳ 7Ǔմ30Bfp]R%W]_)NN'K9V]S#U)hrZ1~|p%umu"0J*O>եx: o՝&71)k޽$f`>;Ɩ.ld$ćofB -QF[b|}KfHc3lC=LZR>QLhؿt1o7M6ti*A[vզbL"I s`G2?UQ̒8G8Uk+sAQ\ ZJO_>zEhdzH\[Ӳ,=C#hj?En J5<-!9SQ&k0)os͔'XN:Tf6f:/ZJ`h&P,2JШ@CrTح9C|3gҴs9..7 8Ql5Vq¸T ­Rii !',u!|LT!$@GAʃ63BF}'H-v1Vr4%dDD`$Luɷ WU f&>& (s@P851&Y˰T]҄HHƀ3!j .uK+`Zצ7A˭JR'lAͦXZu2nc6QL犙e+Trrݒ?6]}.B6FW}_ tʮ/0@.^DGY4 r)Wz/6vԜ[Hnf\ҵ:nLK@~ѹg0mP zQwl.,K]R= h@t~©:eTJEǸGq16Ŝ9s˭S6bePz/jKg Z.PHbh՞s:FY^1Tk%iz ju~~\=J dCAz--V=\g=(\K^DT%QI̋"){p@} ݙzߕKiA`d>g\PEIάNvJ:t?;9 =uN5Kn7eu:0Rd?Ơ}/4eN%gVr6E sm80kv0bQG"`@2"b K #(H8R)ݲJklil6尐~)6_MmV/?R8cX=Wz~Z}m&Usx;JgL YEL+m2UN:QN"E"$^ٸ `x@ۆ `13^zz9a ,-f۲ӷgh$z 4z$.*5'%6WSju G2xRZNȃK"\+OMVԌDtgNl)xFɅDO$K<P>hXQK+:3 D"@P'`$w0vɓq+%8XrnvwbdF-xx~ǗY@U;Ϗ?ӱ_{^'G9G7wuZw.'ȥ_~ nRɥrrdVԦ٫kr s5̑ػ|WE>e?YN>hͭ!R~d靦K_V/]DJG''j@9dz%5$"=eC_~װ0tK˲ rᬺTU2[rҔ߽. 
Fצ(s"hIu%{挶*kgI[\t]x~ĥ3sy픫25Yw56)Mwy}Aq2UZ/$Dn$*m.T2SZޤ4.&˔d_bnQ\.- -%$޶ڛ}Ҏ0`Uqf9]4:އaiO4^)OKRPu9Ө}sV/J\VqrSMv*N:Lj{]iA5ժVD:Z݆fk91]7 6M uG}4UVUZ}Kiy\nckoecr*ǶS i7euC0 =,ybbKESCj׺HT]oA9wh^˜*sdfʅ; L@Dt.ᘪQI:[[=:===d6Ľ_-?ןIW{{(0J];wԹ\f=2{7j{& { .u7{| &o]cMTQVdfCC={N ;ĉ6Ļ-]U.tǛy2b1Z`#$8ə G$*T 7Ju_9-ԔgjڒE 6c߀90":'4"0]4Ձ9c U2 3(r]'Zz]<ϋ0)O"tkcE/6"Hīo:ڳW*hEQ*#k Ez@6 (=^1ƙ\yFt ĕ&"J%r&ɵ02>_Nv5]]Cӈ)Nm!PShGC^E9Ô1 P57jbԚخehreqc IĖPƽuH%o9A ['W)Iǻr x{NYl8شR},:x%A@2[.>^ p{e ›0 mD%WQ+12gԊ\q& [䐶ll'+` |Ӌ|e*e+la0.O}oavt?[th\] FKi^E>vϛc e & ¼&BP5;Lj%bD19@:Hbw3e#dPo<&/Su `}ll?׃2ݙhvqqӠ#@obO{R:Asm #@RGSJmΌ|̓:rfO=4-6eu)W`?m iiF3J$r9%dCOK &TP~2H?,*t>ܟ}nHܪrYP!WH#hihl1HE QҒCD\"A TF5]kor+?X]U0HE @>$Kv~ZڕE}C$D$cekf9U,3dTPkIP̱MWJ`*UC U9*hJ׾afeVBҺ/u K /=''!ümvsmZ]1(WRwE-{^ftVYfwNge srɵ9z-K]2Z-5X4(M5 1`Dȶ8ΉAmdK (5:#9=*m]3s4BM7szhK3dt]6X*DJT9\S<( {OO.kSUR'?x~FW_=SvW%Zdx]ˠd:y2>bj3:rOmpRUyٞ竏o& ,Ju8brKJ:{k瓬X}lߓѤz2-JbPOM&Br>h5Ug>쓟W!J_JOxYM/.h*Y;on5/<7|aԈߎ v.Խl{a6:?}*|aiB>_l ^lOa﫮zf(oB2zjtNi׮ 9vNXz:sr5k7㩗-ZMuXjuWɚeZK1C~RmmsEvt7;"U]{7OzoO>8}r^y06o;vo9|mgdtwv/|mɹ]?gٶ(B'jyfi{6(D &=iw,'Z'b%rO`0vyu@=wIܕ>'?-gW(0{D>?'~8Yƣ2 w >7YK~u?__L/TMd.sy3pr^dc[e٫WWW־eȥL' N&_/S9懓Ӧ9󅚥yNdHǿS{nIB~ J.ox@i֯i%o ;o;}=Syf.UUWoc+1ؿ@W\s4pլu Q4 p!>" ljw,pլ5/zpE:\V U3ٕX Jp%+to[^t]+*%&@+Mɐc\|=]\~/_3I Drp|RBJ :*B()r* t1Z. 8\u o&%~ŭ۴_ݚ2OӸz~iѨarտAk7TK:4-xP8O;]{Z|vqӜ>%] 1f?I1WЂf-qMB5p<p)L>[cR_zE:}Gu)uޢJ.$%˷d:]brGEhQk(P,*MT0tSA+yjš^w>[{W6JSe[%Z8?.PZgjg_]T )d+1 Y9@FGPh<[kR!  
KU3RUJva5R[z (1F`*9i +-f a]'B9EB}6ڲ˚Me_ 7=9}{n=ew5toneuaיG:{v5کKޛ\׻|̓ We{ZQtUB2`9#WGB3|⛯i%5͉xW x}5)ĆJYjV]u(*:R> SIYhtG2KqxTNpS=Rjtq./IN]P};mi㴮*B [7E9m&|ۇtg\`VD9 lj @Ԟm21iv",RQe1N|\pQk.&SRf2+C 2DD8egt%::ԈI^@* .`l@Z¶QEȦ| 7C]t[xceRQ.&p5$["FMF*/Xrl z,9T`C5ʽj]˵bb $^\#0(cY)`' tTAu>몌}u2&<ېdL eR1oˉrumMIl 6S>R^"D:iZ܈=NۍBCwkɦ܅gcvZN)^.%r͘Y%leEYFmD.ap5R0"&5dKE!~%y Wb搴%mWS"NDιT1F+?hUMz3g;1BĐ~ i%mn"yC3 -&rώn?8}HНՃ@}^hDyyoBӑ(\L[M:_}AIKS#U]- m 1Z eK0$Up'Sk=ӉİP3:A* "t 3վm)o)Rr$1Q*goК!Mİ7svDh ]Mx~3PmԚ(B',ĶJ71"S&o*մ⹑5ڋzVj!e=*o"S~%`T(JXԬ19GkK=jolqriU*_IeqI9k.KytVLf SE_ϝxy=z5:_% Z=JT|zZS>*& ^3_cɒGeMEeMk8bԘ[I$^8[s1*A=\}JvR%0:0* 72fz*aaoq(X,|V,]ƌ׬ޜn.9:I7v@..F8\mBD+U,DYjN$Bِ ڱ! 6c Qpk+mhb/s:"oh([Vv䋐JUɺW#2g;be^ڽyǞͪ_Ԟo@lEEUP\$W[+v%SEw AL:V呲+VµV㑅NX\T2X%#Q0J^1W*.6ݧ7sT/r0?EDa@7IX1kms}`V(*U67ƚ㼲bRƒh¤̚mkÙcbKI4%oC"YG'<i7/g\\ܘ 3!O*@+Efdj`QPJ4#7\|\<<;C3 +sMN˧u6(y4b}{ܐZُΡ:fF{h6s5Kf?b%,Ь4+5L`ዌm_yߞڻj;ס#.œCUXС̯x{{p2e fDiUzDmX|6j)fʶ9~s5xˊ4*{Lebtc[ :*k89tZ+zõT!es:牽O5=}ҽuMdZJZ:"yg[%H E%}AxuXPuI5.CgA9wޖ(N6 ˠK1R?/7xi^ *j!`̭̬c`QzV\Dfj ;RWlHAi xCTTAz.T@3ҬsZ"4~ݝiZObH45|urRˇ/'?{WƑH}T_]7yY [,SZ"$%9xHeOK6Y]]:fዉ OqOo[NS{Bl%9ݼ)~}qnX4fj47ܜFCT].Bm>AOe*zMԳlQc(+rr$;Kwd6Gxy/LG^k;Ȯ^J)^g0{$w,p0LzI*8' /QEןߎ_FW{I\][۲,A=aǡW foHT\W薡J\CDž]& ɏ߿ˏ~_L|w?{GGI ?OrTyX^3m(o~z/% af ̘v=) ge.rU`Ju, 󔅜(˥!57h K1HIwN3ɱ@mT)m$kMU>1Uio i:P%˙,X &RI`H F(-z iMQ@t[Bh )kBt:Q:«No6~DN7\6c`>Ebҥyy-pH%Jc4E47!o)~+Rmו2p61ᡓOYu^Ϧo|NUNt"l+!>ץ.&H.rf'qy]_^n2fZEb\, ~1 \S!j9Z8UH& .~@nb;uF?e@AYZ;/2UlZ.*&LG6w4e ()'7r "T//ceӀ Q󼞯u;C 3OF旦xӱ${Zl#J[׬sIn&Y;ےɭ&u+f+vҙ'meSE%Ɵ[uq:dz.<fE[ز-;m{WZn_d vwY(wZ͟!ie[K~tW/dMYk..^ce.^b@W-mYqd=ݒhJD(y^ -`wFnPRg}\~yMsп Lp%xLIj LH|HZ@S{32`)kLHڅhp`2VHu M팽cY_^M'T19ޘL@eÚ4zO\Ol įM(uٗp4NkfbQmAFpJN,Qq08+B !fa / 7x $ZZG~rd=$w0mctO$.}b"F[L p9G-߷no<,.AlnS^~73D|:H#Qa{z7G\teP㛎o;L>qZb A(1VXjS&#O4 .'1Ea»QL0N3HMXD 5DHyㄉ@ D"[&Qp/q,wɓyݟ;9{%ǫ^ .sq*h yd1C`<8K ʢ ]_fWuZt]G!z([l]|vg'ԇit>hξr1 1YFi7=A9Ǔ3*FR-sJd1x1 }oi_-7v"Ԑ8놞T~uyݬ`K^ܩZ91 "H~oWju|~>~KsxrL^׿AwQj'6 w(]x6RS. 
)٥ NϿTZ2]$؉h6 )paO*7yARhU'I-[A9seE4VEh8 K[gOՏV{uu/aLG`yɗB(E]†eQ|وwꍷ*6SVu@U(IO$H멈Va5Cj{~w#.ŝ?;=vvdrgS:"CN^w[ɽ mq2T럧<-3Cw~`Eta=GxniYn& 阠"vv[6a~V/&)+ˎQA-G~.+jf;`׎\{rg 5\mL<%G}@(:9W\LDZH|d ?kkVk5cz1ώHKsܩeB5)(8qʄ'kTQ;[+DF|jq$EY9cR3KjSY꟒G\xGJQICMk¹?g%Qf:}>BІɤp FZUB2݀ B.K!"Jh2{< :. #r=TꍰS2FԯI@ZXh.P፧Iqq,<ٕOH !no_cR~xp~yB1VoCuk϶X˕xCzAbuu,Ђ^e\enMd.;3wlՌpL()dM=uzتސn%yʳ֭ PUfV*(xyund<+<+Um˽ ws|'\IWqhF{:F;AZR0IC21)#kjSM.:7KW|^gzPwY?&hTPU'Ӝ|.K!++"@*պ tQIETK7i4K39hM#Ldd$AbON%'2Rdxk#$jmY˵pAb"&Θ Wo+8/?|jl$Rr] HTB*{M3Ii+>W)Jz W*'e= W|mw_/uqpA%"TG$CPܞqU2[!V'EmJ wt:4?{WDZ\ qh宮p"@?EAПa%)_+{NR!X1Dr9=]]uꜞگݷ!gg3o$ξͼawoh FdeAc \b ZC71|0zH۩]wNZ7Y}uz^SfɉWʯ '?__\^\o&QyV֞PW& yO~ĸ5nl=PNt^/^\iBۦW0188oǏm`Wg@V_"Տ'9bD3>󆜈 aa\}8/;q̋N7YqyZNAk4x?kQ3k8*r;ڛMj$}%UWݪEl-<{,{RuDYTJ4}yijڶW*9#[] B+U &mJ'8CO~p< 0Cia99=ti< ޹á9gBW@쑮!]Yo([Nj7,@|gHWy]]'] ] -ZNWs+bj h:]=3+ l’ -g} a)t(ҕӸ;KW\ ;;Qt|Gb^ҙA^ΙWv5]:PJգ|tȂ ՀRjxf9UDCc^I|99fb}#L ?X.?׾ύ1tCD%.h$- ]7 uF7{N}I>9[ԋOڧP8{Е=Co,\X ] QBW@k:] |oBWLfIt5娫7.F]xt5PQ]=G(p\Z?HCBW@!])/aAtbj}\V2Q]=Gr֚`DWL38:h?tJkHWϐuv?c c\K6PU`$`Gpy1gZ'NWe8]=G,H=|rՀv@#tʳZ܉Wv}[|&g]P3n;g3MXtKy&>r蠟cyiz0LKxfe>cIvtO K6C+3rgHWV `6a1t5Z] ] C‘ ]!;+`CW,Zgt J=DW_Հy)t@)G3,@We9twEQW->]#]=Ý_nOWY:9ggc`Y0Nw麼97I1{sߟHо&Z52kg{ e/%cPw/.JN}Dy~` wVWHqtNo~bD gw^t{O=~S샿61_O|Xey}7=&kOޠR}IqT7񕮺}NTb* 'aLòo|@l?O?fc2?}$~!NbytӇ֡K HWmU]wYx/WoU*ǖJbz#&j3Ǚ(ⴻJ,U~|p}wM}xz7z|wea`..W$g{bLjz-䘔mOR2N9UuvgfmNY(465fO!lR5mVs+IT0wMjT2_RtvV񭛊0sn#=ŋ$3Z0$D+1Aܜh}k5d &@Y\ &#H#a;qzNS|n-B8F}\}j*zW-5[v!2^36/QIjQH-FKwANQ̘n#d34f"bùk"Ŵ{)wG@# ,QQ\jĐz$4l\cݳ] edc; cP !FU){x1;-oMh}GY"pGm5($v,!C*d2Fڋ#Y1'u!g5W,">5ys*M]ֽ*LC+)*ftt@r=RNP ^ᮨR saMѵƒBZRHqu4~kkQ0!46ZQk\.`5 jk*dH,8dfm6ѓuzMI2g틅K  bM7ਧξB1!KutAf,-Ģ;d{j }m.5SGߎ`&dM*ؠ⢀)+>,d0F3SXR GWXZCq!jh*L;ub{EMJ#-A7 /M*}v@ֱ*G$W{H B+uXFSayCs ~z/,,tF^8') >@&= :-2֐0i1GXi0/! 
O6AJ<o 7tX:?jP4lUԝ&փ n"e w*2}wm ?%䋺 蓥>B2 ,یKе 2"XwPS:=8Y}L!%:.x З9tL$p홻vH H1 S{ X4k q[O V 3dЎi>D"+P(Z|PS`etY&4Ō޲㎤%߳(<^`EȚR܌5v`m,RgVI3J+1A2?AjDao8UKaVBn}HjZ`g0hYZhZk`ɎBf}4CS3&=]1ͬ"j\a=9!,6\B\1bAb~xF>` \8 z Aݬ9jʅUz4\%tlsvޟ=\>Ji:T_Vܩuz믯^؛koG aMKufYcL/Z)RCRAÛ([D-v'NouU}Nu=p2XNLRB r6XMNa4x>|,t~o sχM@o'za8prvvzʅ=ax̿{fxh}2֛x >j϶-n. ;AV/)u @>ZF= //*nH /!@[lP'@D! ČH ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ";="4"70J@R[I$1@Jq"H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""xI HXH M k|oH PEI sD# d$D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D t$U>@@Ч d}Ŗ t$a)O$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@C}1GZoo7?q)z1lO_Mr;gy4ϟ fkZP j%W]P$p(Ϫr hBŮ@->g*5#:BrG{dW 0B@dWk}v*%:BBh{dW ؈U+Tke Un DFW[\Mf mp+WMwf?l -.%nՀ3-Ϙ'?wd4A Pg@frke#WhQPhY8{C_ruq*d 6Pz4̧4kr7" N 'eqd|86 fjptݨVÿ.Tۿ~ &ڤ4NOےxdyu'6HuN!?DQDT;? T NCGiWV-^"L x^v["^0OfaEyIk*U3^kk^-dTef^,@tT&}>0_T0R^= / 9A/A߹6VcSE,7P3IΉǏT_tMuRFormÑ$/YkfqԲEᇙUmTb:zzF%-ZOH?<`kف_ôZ}T*;8G&){ns=+Los]_ *uBJ7iY CnBVŮ@c+TٵqrW+oa=+, Bz~*':Bj&{dW('Uv*ɮҮ P;U7vj]J%ȮЮYۧd8{cW(Wپuv]]92=+,]\bWVΏ]JIChW>LF=j'AO64֙'N;i]@yb4'[,]s:BggW__5/J87[ ~p|:50.U>[cJͅv_/[{'ԟ~e񯟦Wenז> }W!~H-5MC[qq6us46¼ߊKf!PyI: (mJ!HTͪhJ pun0|u)aV\O'鲹V' kv6gB(38tI᷋*Ur!d2[/0 oDP' w\bJ{`9.^K3pi Ǜ͞u5͡Q͖=؋5l݆wwI^.XGKדE_ovnvx{w"j:^lƿ  qtJzpiNg%A2;/"TBJ+封 it~QJCe+1b |*{afCkM`$VIܷډUU3 w!RhHRmZAD˝59wM5Afݬo3c؊)!b?ُ6zەBl Q]wyo>om^ Sr…9|sV1|`ձ/ϡ*mbv?b}ϻhlO7ם9 T@'C2Lȴ*Ia4\'E1O i5к ߜl$Ac&Dx!@ \ {?VS8_0\Ϳl{ѠIfl{*t9;AV M!..J:d'T>0~6࿼ wl=Bj+wݮѭmX,F_ pmgYgbn$cvw:;}:gי3廒=SAC1g椭S5]-[I&m%CZ)jV0[e! Ȳm(53Dx,1 '&M!=cJ_b(FakJ2e[L8J5K)wMϻg%8r:^؋62Н]܄r'i_o]V̩1+gD.0JbըF !gv%*%tlqF7sn<+^X]B'#t*̶<ރ~|ү7JqC@ xjnRRE*&{VAY#M:o+w} }wNWY<߁&moz`Qo;3{E~@|ḰgGo6̆sl=&ດiFc}BV*ydM Q4ȖXTd5YFi0,bd-yڰKUl|J"[%wFRk.!IJfUh*nF)K }8熵?z! 
$;Q5i<\}Qz8q~o#t&8]5ǟr "TTNb9HZR59w~!,9aA]tzb}NsgYoz|ƚ˖ xvUM{xO,}dT}ES85CN0k:gRRdj5h8+ZuFl͹ȜppJ}OA(r(ժ-p#Z &ն3& ~i-Vi [k]BO^Up6x,&4a41߸cTQJEX޵ƑҸrCK$sl]$7pASXQ4lwbwό^=3[niN&*CH^Gځ U/,NvXv0({.^(b۠+Zب3v,&Q9̂6CF fgxC=xԖlX`t6ee<&]ϤI6dg^ x%D$aPy*vH2!CEHdML,( #(ed>Y6JƝZO|`G8JĵCr5\E!1: PlwQ[ -L,ͩ&ItJJD1t$eMsE.rGdB MZ99*cg GޓDړ\%[`rQ ,(G֏IHN9|)ePIҠLnE6򕌣\|+pTyCx<|L%0e:kȍWq #76}BޏJSFF:/_*LIl\2Q2fAAL/TExys.j\XޥJ %r k " n!D >bI'y26z?7^b)M_&Z y:;y-)\ѡNXP!O(eY0ȤTȜBL)XnQZrcn{A JBk 96BcY!.5$O<9IC;d;cc["ijiެ,@'%LJd1؉\ ){5,+b/eULPc9sǒѐHۅ;V&<3V{Qi؍ hM'}F+.X#$`bb?C%-!a I0ÆEۻR2ůǕw}Z~{P!]Vo,,NUX}k|ה@b=aAI,_9~]u⦑Tuvj2{04`xXiyK{ba3L읳c8=73io~Il/ں{(Ҫ %R :KkKv'I8jL㣲Ϲ:\O3SЇ:>}]#Dkn[GBg7?1Rp^ZQ ri\"ΉQrqQelaukL]m^LkkcK˓7=fJ߶:G̙8I;`Ws{8RMgӟ巜4 P\,h45fb9gOM. -*^o u^iGJǒPH_S5U|W0 Q ğ jV?_}|3:\[\+YBz¤CeOc_hfgmaɼ  *Zm809?MG߿ߗ폿xwG\{{7te tT`ga@0p?y 5 nZr݄oѯC[ղpba/H?>h.^2'@=T܊ YFґf*t$^F£J{&zP"Z+Ѱ\qEA9@:#}icB.aY)NY#O&./L8VQX$my|Fz_&Ng;y:nOΟsuj] zY6v^mF f-Ppkծf%.-3cao0+G \=[z8qS:&a,Ӫٝ~)K2"//OއIHF`ZbMXH8@7gy =t/ʻW:>L.OiEpPzdsNUc ܗiYɋƽs` 灡2wxQi୍1l*Z&%y{^{HZz}juilr}JUNҴv SUo$7r{W {taڣm2ynqt{-l~p[;0q.Ӂ#ErA!IcVGHHtgi$K.n6#:Sɜ$H\c`[XD 1ׁzEW5,jiM*yZ)a*Dgyo#D{:g.1C4hW5rqg EG]PL(ScsXZz,[wuvmqPqѝw0b@ѺtxK[dRmE"DY>Ѻ!ZW}"/HnNQ)} f$U3 zktL>gpѺKi:D펻AdBFE!Ah 8bI B♥@[Fpo!z# JmDzcx9YPcKGg07 ҙM4&Ƞf; 2faz3 1, :M̌P=%ST&\d 1MێV8 a6&IE.kR8N2^Y˘8ROX`9G7+T){>czA/Ȯy~%+֙gF ;a.N )iJ05vScfN!۔Np`6%|Ҍ=AIe>#4_ <VݴDfjIlbss`dD61Qh^vk (]&ڶ/8]q:bk?:CM k=G6& .&?teY=7ZGȆPY{0؂]]ma?ֺ3]5]I;Pɾl%0ڒpVLem>\d箨5I m\4^AMkj݌tN.l?g!H-kj=u{f]҆-(h~=\fk-#_y5OyXop{mV/ckKs1Em&KmmhE 5vB6.+JTq)]kITrS*.A)J%)nk#EO +ǙOBE:u`u 57zM`w |FJYg?o.~yMs0:KB& )geג0 "iL!8g= " (4ʌ)7>&!j (T.u*CMd*0SځEcg؇ BMO͡K7i8g`Nrn٧bu6r7M2Bϳ@ycMe#ZC!:az!lR :EgTDxPrG0  m9* I"Ck)W20'=G!8k̠BSfGE[F <מKL1pLAw0v6 ~K~CysBdD'Qb_?Gv`͠=^%%H aY³RA,[oCBN;Jqw2lfކ={v[1p(`hSECkvGLewW.ƣ &e\RE/=8D=d 3itVqJJF1x1S#~imĸ 3_1qjfnqrgYoGǿJX`WA@^Kkk^,N&n${QMwWw5+C3-?edddD:L{0 tRHҁ)m;keɽ(kWe}W r] V+~?4!Oܩmۤm/z乲EMm.l{m |e/h2Ao ,^}%K`sc,ݫOCd'=vabFϟC*5u^ e*4K T"hqZ8atuF*Bjy0IOgL]662hf-ښ0fP+K*!/d SR 
28]5hˡ6B}f}րP;-S҇v4igrz{lxo*K!*+-((a$xilB&xS/m U vlkQ me{)Kaj!svF" _4emBM/6&g{H1]϶!2Eme2{7ѣcF8/fUgv1No* 7r@.4"ZɃEj2W˛TƤ,#s(>[iS򆳲Gy0o۹ {(Hs;?;^W Uy^/|Ak@5h@ҊWn7g?R!cCw\,>_<JxN|xڽ-Z&|9ph~3:q{, mF@j-U"ޱr`SԮ\TjH&Xh;>"q5"R B2 fd]\#v*F+R :j8ɺ"Og-\Z/4*jrpJ5 H3TT~Nz ijW|"&\MW|W(X/oٜjU~Q6e~A Eu#>m;nc~-.BȮp@:cJfv)]Bjc'|CPm!5T֔))o _ dпw\?fW_4;W_!7Үh;+V w}(TdYYj-^ʬ^Per8g=4ȇۑ7#M>J%|7n/'PӔBvZbOV4e;fBSji̔"d9(*\&|1:ݫ%ZȽ6T';m\gq7悙 8}"/7{fjh8Iq3Ij'_`]wl;{z nMBA?&Z$Si#;s=6 .mvzTHlpr\pEjWҺ Jy01 W@"&\MW{8mQW$7.BAF+R|qe /:+|+P&v\ʎŚp5\A0#\Aa Hn\pjRTd]MWa#\lpr\pEj h\J) - ʱ6ARtBJ *826" J} T)jrA:ox3O@u`"<m:;jߦ-x{sDw_ߪM65\rBc!Q>%i#]" =]^JeA3 v5'W."6Ď+RDq + lpErAqu;HeP WĕVBb+ W$J."cyRe NWF%#\elpEr+R &v\J6Swq+W$ H~3*]lE W/+P Pl|WVG$dprp^Pr@\\dԎUJ&+֜+lg+Wl".YWSU0Z N'(XKW$Wk."}([$\MWΊดg.)w{Q^ S<1֝[^Y73>gz iY5V"Rl Xq@?hs"Xslw8 bKpy?c2֍tOel'zB:d+ u $X!T"J$\MW Pcd]\BW:,j:VY ;:W ;H WĕVZN+l :ԎJH) ,{d+`+k,\Z='L WÕJv'/q^(jrq~IIiE.?/nNO/?zhN^׏:o^%vۜgo 6h FU-վ[:[8/>Ϋ?f{]Rjx}f]wgWڧ)-sQU(x-Ҵ~͗ {m&!MlY-[ۧ859 ,??[ G :Xw?vԗHʟFH3su'}&7dzkpu^ջkAwaE~ϏB;Bٚ(k>[c l:PmPI*c+l5JCٚ mu[8c;+gY^8+U}?7l8i᭫owFm;/lo ؙ5|owAr`#;Wo̊iнk?Zv>m}NІr-`mٟzYi\,VYO;?-6;PC| Gmc0dí 7\WI.e$Wͣ7/ރ짡&ʫG}u[I6+61J$q%`+ŃM~zS^: 2Y](\6]l¿@_YW^k}יdj]is]t=lwdn>dmlmc[zE}[ V ᷱnMAo/2g6C1Mol&[MoMO}բ_~mNɬZ}B&];ҶMAB[}gs4fcЫ}- wXl|% m~ю^קt+;|ljjxɐޛ{ӝ_]lvƌ#1#@ ᴡ4ش-c{>챹$4*GW^hi<ԀZ6nOHnRLrLMp86F9-UV刮t="Vٔntp &WB.TeW, ^*m80c\BrpPW|\0\T{{ $᫱@l7 qkWhMFъ[,.g߽ys4[>me\`#*jIѤywQ>c3`^=q/\Sk8.zq؞gCwzg/Z:\yJĎ+TI UppE.>O HW脫 <' k0lpEr]+ʦ!|"'\MW KW$fiԚ+T Z&\MWV8#\ 5\\NjW'\MWN WNJaZ6vR룷Pw5E\yipRlpEr+R;R*C:"t6(FB96"sW҈)AD^W{5 R;S {{a2^OI޽l'uoľfƥnj݁4ٻU}FQ>,y3[%Q^PBf&͉1'"{ehKcAnnyL%K 6/N\5nyja[=UvrNNܮC/8 w4K6Bfc^jX1T:p5E\)rp4 H\pEj]p\J&+_/xlIm"\tpeu` {Ǻ"p .v\ʠfpFBN6"\pEjWd]MW(P Hzuc%2t W/+gbu25X6"s52@R} p) WG+T "``+$0Qkٜ }W p5E\9;i,Ej{-pȦt$"W}&$~|]/iFcTv^pBzvcAkGj5O%DQ=]^Z-'\`W${.Br,'r?y%\u^FoW$,\)녊޺"F&\MWI$ vڳ+R|"8g+E`++lIm+Rij pE\\%\Zccw5E\Y|0 ]JHWVqE*A%\MW΃4NQ0N6"Jsu!v\ʎp5\"E\Zb4:jxm4Q[q ¸YV5}m'{MS ƺI}^E'!kXƣ TO ,uVҘn Iqbޔc 4!CMUFdwo֯?u/DQ/me"/3˳EFwt-uc!_'؍ߠ0id/FS{_؀u}f+M 
,gfg^>b$:$ɱim@e*?\?8{_O I9v^fB}fVWך{7@S]< =t͟_|Q櫼_Q^~^sS|}jɲyY5<}6mV޵qd׿2ЧّVz0dKQO0Q8C =fCZRٶ8+y[SɊSv=Vz=o!ΚOlIKpϗ^_p'v+e} lJj߭zxōzlb]HQ>0ޏu<  16_柯<4ɼICÉsLv@mYۇ7~_%`m֝ؒCIhz#LMrdgTnlw!+ 8?n4cEtշi3w<=+R[7{cFW첥aQ+j%7Q5sLUc!TQ*3.-iSUjt7xb%^Uc05dIZR_Ǖ\Q7ʾuUQ*buXGxx;t*p}sb D1ҁݜh}k5T*@Y2LF$FQdzs&EMrY{wyI5UK[[jn\u ɚ %ehqjJ&QZjG-:ŌiZ;B5Ccv[\ؘ\t5C1qZ䔔ջ'@ ̑G}iu┩zY+Eǹ;ІRA5P:iʹc0 +>|KDcUYjO٢0ZGy(P bwب"?ߥY*ק.L{qA13X2d]șcq4>5ysUUȫ;;]SI{k <ZIIT1۪Tw$# 5XVS sad;F7E"j'QjShі8:FOkM„k9=c\.`5 ja}ɨX8pj6zUIXO)Vէ@cZu)jԍMɀ1ޡKutYB4BF׆R3uiLZ%~(/VL ]d0`g,a{PQAQxi0GΙ68AĹU Z.: gC3cM(mLt%OBJY`(. -c54ˆbwY55i0\7(.`hJ֬C7*T.JvtZ[YzҥLcs'[FrըTb0VjXq21P&44:<) d> 8׀V%&Cij|U2TS <8Nh&``U %%2z3|Wh%W e CX' @CР,Lh=4v&׹;f[*C(]g֜ANŚY kG !.A 7)ؗGBa@Y\Op j[9(y;XGg2~.nV!7P`S1*E2GSF WR<Ȳ *E{eh_u4 : C굱[w2Z*LR5UUuA"fZ.{ffr 9qTCBb|9srZIhQ8PF 6A"j C$+auhޣǻ B\FѤi!2(a3RU@ ǘ@Qc iD#NX&mئ0]7bBz:n=oAtj]`AZXI| xḰK<<>o;@qU29{H\!ZUF2jx CO`YBgԅs {jN4\`2*U 5tq,{Ѽ<04':AJ[SH܎m }Ïn WgUS ]_^nugUc(7YKaVEP;> VGdm<~:;|di*YrS ,7.B&ȈGA]O9s !/f90hTʗ%@_>1gmD'SaʠvoКf6CܖSelJRjhNJ@A ע+P(vm!Y5Mh3{t-fA" td%5vhά$1S*j\(ͮBUqwqpD쬮pj,ʰ2ucF9P#YfZNX  snqt֑pipl4J›rɎުj"`!-#jw$aU&0l3p_h&]+&z̃qW t~ c8 gGX5`0u1vMr63IҢG:4'ན*٢bqֶaXS"^y"<4F&x313@99KlYeFIJa\ÁwâDluHckB[uչc]fMA;h Cʘ@%fti  Pp}va`cW,XB|]Dqmb>iH&c}?E bZ0p'Z[T1"@΍-|0#p Nw=us[`pO6uHuy19MC)XYR@ZxYTAljP%n3)e3JBv tmr4=נB]%4>xZz%@N (|uժàe6¦f@ F\\Hm??Д n FI‰45Sa=MGCXm(6 qLjN 4XoYk6ՔW˥!X`PrAwʃf WrH28Ĥi2s7 \- Х56\~B:T<!GR~ދWn+n;.*3l]2dB! 18ͻpK`wx}pXJ)-Էe$k:uݠwk:p76ƾV./֧ؾ«wO\\^ y۽~Í: űˍ=p< rOXOnhCv{[綿ǶY|Hg8oۭ_i:^:Fhvަ >8Z] 2 YqAS> =ɇ ham=q{ O~|Fgjlhje:h~t5P(t;X_nNƼ(~P(|{y^_cvxc~{C?n.HZc*?<;aǶ0~ev[#W?Zm^.kKrˢm].fe&RW߯燼. 
$p?w:lѺ.lې0'oo\^ح^pˡyBYD?)Y|oXmvO`35vIzI ub+v4՟70|u m +Y|%07"Y;DL<"`ap~Z;] Q%ҕQo}Dt5:\n|O +Cch p@jt^%#+jXjejt,tjx.N ] ?w) BW/!I]0QWBW@kU|t5P%tGGe#w73@[t(]ztٻn$W,0楊,.` yh;-[Iߢ$r-O>aYGp%ޟD&2ؤu;joRU)MTnઉEp%҂&\}DxM7+fY<OggvM>NxVzQg:#9#b7t fНOeGi*ZöCp+rTЯ}/D;J{ɪ'jPχ:km ׅXj;Z/Y?l?# Y2VV?c B0^4a׵mP<꜒**]]&29V/|H<z~;x n3 n_7*Gx8S&}&4ikII;''09s^#ҭVjbjRu)\}@2iʴWW"n0z_Ipդ>"\V\5iȠKӤ5;]5)a W"joઉv_IKpP:#&2ؤu;OYWoW~Jvvo 6iujR]}Dނ'2(#\5q߻6Гuj+?_7+ޛ}JdIϗ&.w%yWMJ7W;S=[_,] $>E'D`̟Mܰ7Hkk+I CtCΟd-<ЦՓ3Fio%!PXѻP.G\;[^V^X]wBI wH^UX2ͣ~ӼiQ$UЯʪ!/Aɹ ΫL1+uL-߶/b@'XG+s^/4ze?րnsw`e%}½\?S.Ws~xc|?'n2?~q5]}_2?_Ǐ+^{K> 6{/=e٤+E[aZUKyr֍X`5X_ȱjΘK ce@5hZт\l?ַ5ԎS _ew  lmz c&"e)ZMI6B#dYSQ.+pW޾8M/ڸ,-"')N&> Lbyh\q$gԍ9ۇo9=,ڳ5>NWOgGY~յ;oj=ubW51@QH^єNӣsy[/^ur8eK?R>#BT:S1iP ]sTD[kV1kUظCF=Tr)๪KZHFo !Y0}\55lĽTLcnֱS 8jkov|q9!B6{mt<}̡vt;zUqvy=˶/On}[X<ϝ+ۣ?XgnqПr̯ 8o-..hrYFdxU9F-ONy7ouɢ]BgW"δo Qk妉er*?T>쑴Wh5m ܷޚYu~CȀ.x$=[U?ZbἧyaSPU$ G/5*Z̻/Em[C^LYs̶4E5/MQs:xgtJSFj$7Zk^jV5TV^\(·AU-D uV)={z^Gg*O: M3"Oi 9ݕ^GI|KԪ0:Y9dD٦+x]C38b kVK"62kcTd& RHZdW`.⃮%=Jo<2, 2XX&Hr6@N$df'LXt"G3DC85 +C'q6h,rX/Ƭd߂TJ\5 XhS h|]k&풍i1'/#ۨi! Zr >LqS2!Ճ퐍M8ڮz~MHր6oE  `z˞1kyh1s]zo߂Ѳ7O[[r/yL le89ǿ; 0^߬>]ΚL8“Fx)#:Cd;*u)&ө[fHV j^j[֬&tSQh"Z*:MхCΦx VSVb6D."&_}PfN:Q&%/RoǨ7qvD]<]T4_f77tSyߧ{z휠;͓룅WHZlĶr;grӎuGP .*PP] v'T83x?3Warw'WYƔwBvVmm0@6vEs&*S)}=sdvmD:0ڦ"BXJ ^Eǚb1`r>׳c؛8wU\])td㙆# LuBK|&E0EB.\Q\ZԢ A؀WCeP>\N,XϨO:3$1o,Z"!S K^QP{aC܋TtvG._7rc)7׽oMY \E7bHZ^#^ϯx1|m_ LTtr6T- jo}~_(ndekj/d5#STXK& 7"#_K))(j }!P=Z#* 72SW {ӌ]0`bJַU|]q}x4/7]Mg8b;ERJ x|q%tچcRZlCly $cMMArP WvB5X}0bEYڽi3QUlo@%Uk4SGڄ\%]u 3&h*(W<ږ!gAXtk NG(9SXTR>ܧ7q9j3/x."Q8 {lK+ƘXJhmb}ړUFX7s󚠲xńĠQ⨩Zl&o#!$ST^dd >j55g}2W9VF`%gF"&,ڒ7ō|*rsNUfR̟o:xv*28jQc?GG[C b ?Ƌ"#qvmp(Fyw٨,V/ /δ|2&!- V*sxOV|S>O޳q#W6-Q| 0|vrbq.gŗ4kiF;3c߯4-k,wKQ7,UŪb=>Lҙf *GKS P&0n:ϴnpܘ,'pZ\[0X!:u>}.a"Mf۞/p[D W%Aت @6˩?/8-%U&i޺5/51Q6{yrz:r]0LI+s4lm[q\&>?ơt}9I(Wm^f;u+FшdDMR~b_ &6[\+^Bz CO*Z7̻h5n&Lc6p S"Y|qr-EZcBxW. 
&K>.4/vOcQ?*MIG{IHBzIX2#K-s ,%)7Q4JT]4AzumlX0a\MzYXջ<$ǿZ/*DFypǂ\C lp(]xmx=*(\shTLI#j E/E.wXDRs6KS܅iKihċ-jDF55}1n!R˖ $#9ѣD09'@y\$ŔSfoX؍%R:Éo_;Z֦:mRwt=SB-9&< sE2H-8H sgϚϟ͔75N^fZ=%Ɵw>Z:yKAsQ3e %NynbԪHՃĥIZOS:I,6OUYiXM]?Q״ oIڶ~7 G-V&?KV5Vn~5m2/ŢdjD5ErQTZ;FGx#i!L"185l[͓%Vy@$'T*'(Yġ\1W{Xc$TOCjEoWtJ߂IQΦ72Z[I@Ǔ,w1Md-8tc/g(gb070P4֧σbiiIolR'|/=i M :Ǯ _ Y3a ԵL$FHSvݧ$7)>u/X)$Xuז( ֲD!{QLb:w[ÛoƦL'[78!H87#Lj%S,[L& Hx:)$pFK=Ri %ЎI- k!,0mJjddt8[EJ[,DI\L"V&mU: &` м$.zZ&= tI |_ C AY*$Fbk!:CЙ%ED0=^O"0y_cckU:h`"'")QVG@߷)yOX9EKA;*$1et$SiMLHlu|}\rӾ.W&R(":BYH$.q,"[%Ke@gm`A{1b:t2,m6`VwL sW'0@^zsR0F%arLb1J=99C8jFufh۝ DC2dk$ec0ea&'*Ƥ1#',$X('`/u"T`!ӏxo)=&LwHww9jyz^6z6f@ԑG4_Wh:kJZ{ NoT}V yH+gZ}}y;b$pr b9JxԁNI[Kj(vG,)dR2Ij=y~WajRjfDG%"Y9xAgf0MJq?%]J(HbEv 5GH!%om5]Ddo-:\MK2ϓi:/|vVڲ5o]E@nm2"mM ,V]^["|VSE5n_Z׳֛Nnt5WpQuwiyjcE+=W݈hC]Vs!o!bOoxv ~k൵DM\|k^O@23P0p`[ڮʬgCqa<ݲF|ru* B΄*ݾS;/Ci&Cy|KkнQ(BIdY[b8JȂkQKz!DrG6 e )Kn}Le4NЙ Hhe62Eθ3p{ xss(=G-胔k¼bhYjx?8$p^dnjVNx(-w&VT nt Ko :F $yF˕9ƒpWVg c>"5xXm%\'xJh&ZzƄu&IsA;N6 dC`RfǒR oq!yS PynvS^I鮥iv{z;D\9`äZ3P ~}Wf<ުL!*c|dr̉"y(cCY9e%$K9(`!3OOczZYOyAA!:yDKG3E2gBAwRZSu gO޳k97O(BHxDC|J2 .cY 4f]t>j$yAbڈ껣y7DUѢf?0tPwv.>ّ/;k;w`X91_=Ge岳[Sl)εu~v'6Vp[ǰH-JZe qƎLmF +UW6.T!aO$ssN܂3?&}Ρ-!̑ByaFaVŸ\KM~0PC#K@6sijn [nNUs ռvVڥOE4,CWdrnr_ݲGkM(vh3 4 jZ-Ƶ5hmUuuUn1VJwn=S!@ k#kC=hhrvwXV@Jiz? ᥊} ,;˕)k !־u3=2MhñxGhlF+`l6+\crYU: 2`Yu+js:'?\xۜ_"f@#6O[k9T8\k.6dT*C=CZϫBxɡZRˇ J\8 C=}lvo=y߇9'ۺ~7?\|yrKhi{I799~IEnWZׯ.2'ni Q{٫@w;չzw?ً.Qw?[urR>qq SC?Ŧ{uN15avuta7x0HG~5L-o'> Bz4d+L 4zm.`RǕtj2;g+#ll6r*y\JSp5E\YrC Dÿ WV:D-qBɻb.\\ؕ%HWq3.Гjlprk R Nz=D/q=xɨNg+jRL(|4*D J;rZJWR2v5E\@DW,W"ɏ]J*"$_[kˣ^]\]~G5ggwͫ ۱q17lM;GtNjn@=j3ڿ&7:Vzj(XӼ3hL^(eW,m$l=Hn)zM]ښ9XĘ7! 
[2P+>&ȥlAVtcWbq4:ؕ|p%u+QkBP)k2:`P XU&\Zw%* &`!X|ƮDn>oE-q*A`p")d񮨯 D-%fUSI#h'-X@ nv pݩ`==m]N-۝(9r3T0٣ h`cȃURHmU m۰Ok o[쵸b%yu.\ZUҵ WzsY w%rW+Qiw5E\YЖ(#\` D.A.bh u\ j:@9"8 \5:D%Pq?vPB66\ZLWjrhCP6.+Q{ǁ*AiUp5A\y&#\I'G2Xm RZق *8m2 d+ $2Z J]MWPg+rL>cW"m.!7}]2v5I\qbLI oy1Qf`>EÆZXoI Y g[Nj.~fT RG`['ۭO ?N>Hp0NI&0H-# < S?%X0i DJ::De("J`W"ט\p%jQ+Quqe+.d+{uGY`ʤ<(z+\Ar•& 26\ZSǕXj: W,W"0\ZJTRq*J|ƮD.\pjJTZ[p5A\yrtF`6.vQbZjbe5.qgJlXmH]TPp5A\WM-YoG-, RS]ڼk3Sku'1hhkٱD)h$8c9{yZ:R a߮ʩ =L.\p%jIWj2U9yW"8P6bZQ.6yJT) bF*\\ D-q%*=\MWAWfq%rWEej WhW,UGĕ=w:ޕ !\Zk\ e&Ǖ< Dnf|nq%*\MWl9Jt>ȵٌ]cSqE!4QVg+ xWֻq%*ĕG1"UUp5\d5/ €>FV*|;r~q!-f-&qt7m/7KNmEܴIJ_V@O~Vٶ /mwtO_USVE{׈Z>Oܸb~s{uƲ;K?M[oB^c՜~rvÙ5dC>}?1n!Yק{|{[o{Sn-Z3>)zq7w_㉧z {?z,FN{˲>LpY ǵy?G4' G'>|Bn2򒽧?>U_eV̿kVPݴ~8 'jβy_t5Ѕ6.X֮&numnv.T5_ }7lv)w`>pi}^g7p?nV/~Z5fpG]]/cJ.Ttn"ݹHІBkAjԕB ()3ݵC=A֪H| uXfР:&ZY0j輊j=).Fc{?:l i޳x2UjkƆ\_U-PвSo۪mBMP6r/:6Zm|muيv6nZYjՇ.٥fK]X;TZ+tزdfO [U#{M˲W^4[36غ1]k@CjuM]U| /Xfc3/qKulЖjv[/cٚLdP2;*&;&  m.Wa m'޵qdٿBa`,Y 3_zڂ%R$^ }Υ(%#vxmփluuQ"OVvU>1 %o$Z4XLզlR\=D@`1zԼXt):Q} ]&|J'{XMI1!Ku/Xe]!(dGhOM6deGގ`FˣG-/VQqр)k|FnxґG*ؠdxiЮG_qLڏJ$[PyU벫Ԕh4yֲtXBc}`1גuT6 Ņ5 -cwk.4˺ F@B+GR7fa=gA;` UhU%`}5 eow@ԓ*eX\lG;5mB7BJJJ#L2b!]AА^6p. 2 d> A2&d]g(MJt"T]!z@e gS d/XmJU %;.j!ՠPwVrEdܠQ { @rLBB("2m ؍ttg-JC(]eԭ9+ƒ cD[vThE޲A+}ƢBwD"4)YC c~?Xd:$fTHQMfWbd@BUq@ơ"2ΪU%? 
l>=NhHYxt!{ 7F+>@\3ѨR)T2KPH*nwѯ[պ/UA?UݑIB5k/ڗ?=8 ^;q)d׵&TwpU?B/F<@+Sz,/V^ @l-h2Le47ەΪVtؐf404[7fltY@o>5J r9|g׵bu^LPyGIDj/EF7zLKSiG.<c߅gXjO}{lBrz/Ff^<{?K"~D,YSěpX5(H*aϔ&Z1pQV3} Qyǣ˂zFBY0r-:.SG6h6)}܄Mj]u%2sVZg^k5~Ƌz3(ͺG5/S*"/-*.e;]pvA&TjO-Fj3ccLN; wWԪ ^q6\)yz]ZٹOh"7/$1"'Ϲ#ү<$Pl?eEmՊ;g+ R7@'*Cct 5siJ1Mk\\E@[OA$TK }xk94:D.;rE7t=evM٩6\m0kW}3$SO5^pY[)8y[7TrS(:W-[v tefpvKs)MIdEQ!Sv⍥8UXIs5O0@n܂!iM7)u \/P&6G)}* H.a0^f}3hF|-14Rpt;5!Tp6dz,cV-JnQ;P t4zyni.@$EKн hO_!G3B[骏LYCV"hh}9x)-.p>4+" 卍Yϧn#s {ˀգE9776CiK+vROYh9YJMfI&w9xVwpٹ@We8t\uyw;dA`INNwhȌMp֧f|@)p}s:fFRUGYE\%U]G_I)vԛj j=FZHP2O ޙ4Z -3@2̰JD+pU1H]mYa]H͈(r loT3}P}_MG_^)if:BTn>-$N xj!k?% Ǡ2IƋtNnƬEqEG}Lù jh[tMJHX#Zti]˒j2r;,T!:*~t9LQpA>2)RF z(Ma_/]rtYCsÅ6n=NG㗅Kin@HѨ/}MPa ߘ-uxܗ} _fsdۋ 3cr$N:`t pR+(+r5EFh_)"rQ펰T59Bڝ\t4ۍh-sn<(2i.'xjuqU]1JEn#׈)_)9 ?s$nJ74=#d qU!VH n תڲ=%4f;Nj(RiRn;=ž6r.+;FxURX&I 6[3&~x֒0Uyʕ2(6$)m* صW; 4:c;BlJӄ+Vk͙* W6F-}!<]{kg"yi J<[ָ)x&E-0)M@ec1艝wyFG)Yi*,lWVJZ9.Q[P[l\qaGOwÃ&g%+)yy?g U6F_ PT`F[U-?wBk Y 8&CB8x*@<=>|K4pz; 5ÛE qԕUhWF]4ZAvOn^-vKYՀ ‏3XZSsYUеR*uhPٲQ8p-:VH&ks8ͦBgq+{iP1=e{e-˚ixX:2 ƈ6L͹X̷`5#TX{gn Zda[Oee)78 j3qڅ4mD4u0<DZn>K ! 
1Zoio E^ X4c̰`9(Tc$Ha.cPpP\;.4mEA4UVB$_?\ f۟#xOzk_mߥTk5eL tTiM*cXY_Bu@bZ *>*2e\PQ@[laWtlOX ,yoRO4?9NZ3>x|/>o n\7'ƓWxQ h Ɉw)NlbGTwNZk@g/'ٟOOOWw'O;j7Kv=:$гe6ELi0yw\H `$-} .2ex#@5U^Bzt:GTwOm\_Q}QzĐn%)d Kۡ>ah$S-3gYO7%׾76zmF[V=Zl<{5w^ժMKq/ZI,j>W)Dbй\ę&)ϼJEnNi3.s.hxD Լhǟ>~,^ J+xbRMcszr}1;)1U ttXڄIDĩ'ihc!?9iݮ$9(}>X3i@0S`tCVjʼn"8ͅe%wq KZp rlune ^"";B!IzI L֘4#|F%`*BJ0RIjj /PZ N '.GA){BEH&Ot g ~ U'O\d:x|*x|T*DY+u}_jnYӴ16Ӥ\ULLc <Ѥ^j"@d1NaU Z0Dٺ2gKW)E.r' EXF A8|Jy!D%Jf(4|%JFU#QݖfgKdK?ٌemsNjx2>y\ρ&Ս5dLL:*fD `H,܉B)KR%UF{*hrG35/r:.x`%*xuT}]h^sRN}:vgSM1V]_&8f^^u=%|:ZKHUd|nhO>%w67B9=;>>*K"Vl&ʼn.qXc+d۫u!)!ru/%pt~ReDPNJcZ&k>>Oq<RX+jOhcDO+I~PJPzT :dU._3uŸ6}5Qu^a5(p5UX U)5>EH".#,UBH$4%Q̗zEBfjY0O-چ(eT\8.6ߒ\yr]f0@2)mrqDry늬˛\Woɵ!z zPL(uj@ b!q5n2@J>%$ӌq2KRc*eG-qĽpR$*Kɕ?\*xLp }'|;MIdTjҘb).$HtNPΝ}g)gcJ@<3V7i[_y&!\ZTPBWoXJ0L&Jʞa;)Ҫ?Y(sɟH{<5KMɬ BcrU_N_{w~wC6cP/$ a˹  +6R#a)1#yx"";x*xG~ cGA4Q2ٴ/v(_}l4^ׅĞ A pw$fR;5탱JNoABHCQ@8%pir).Fm^$cA0>Js16֞VUv >>bqy:[!k F!`fk|r~[]_ŧ}A>au{`^kK:=ߠ%䀩k9@kX9RziK>og+A=jUȓ82X(uǤtp+)`(?[qʘ06XlQv4ZДP(k.B"!ŊU4[6('vY(a=W6in\a":DY`A(J!Hw8w+zàKҏW!jv3}w:br<#_It9V[4 mD%z1gas~?фr)G5kJڦ*í' RB`X4΃: VD,#kjp|?}6GI &RV8nCbF&!`1.13]%pk`@vw _~o\|ûw3zNgwo77ЦhOIfv5;owww_ݙ^Cp?w+NΏ{F@C:v~LQ{~tfu;uw4pgGo{Lk;~< 'T(4&-t֏ݾ~ߌO F ໚!0Tiyw6g'ql̚޸M̕2~!"ܸ{#1`~k{<< ~{j}ݸ9<NWl Ss[3M?r }?80oyfl/NEl-/._2ýøbǖ-_LĘLտվ@l~~e071v./4=y*W{18kWӧw]= :S!ͽ^Ô;8˴px +; [t0YO< q6G]z _Wm^f}u]|8fYM^oo뺯|8|, 7.^Sc=W[6axw ^8 ~2v^wg_ޝfN7g,i9Lҭ^oyșd~7Yx{<f_a z[($ gNOy0!;]ݳ>韁?bNۉgwg}잞«)L?72};ύ,ύiw7{U?7n}e;VnQDSBJwX掱B9o+!J|Xku3E{Mf`b]psm1BK$NZz$6!`4ŔR "*MC86ı! 
ql"aGEcቡ4:& @+0®< ɱm愧2tiL(UZ1x& B"f.2 QbslqB,b nsBY]7A׏]o~][BB ^;.m`)$R|Y*5s R1%5 3L)C 3-1PDkaKCD `ek/ҿj}<2*MSyT5Bs?k4ҟZ u ^wjxC\SXK=c!z*0ɕ?\aLxdzFL^b%ꍤx}9YkIHل]1I~<5c:1H'R`a,]9pBYd*ƝuBRl<˜Fڀ BK]UmRW ԺGǬja'V-ĪXZBt@`-y¡ ljk=t8*l*+<:6TitTm|5P2kY / D>*JǔsRSk ubMN)ʏ`?5528KL)Z?!S ƵBS *<ƫ6TiWm|qܩb~G37TcfNyXj] D]l:o 64.m\59j}j\5q4U"T'!Y"jcqDHtGtw::B#qN40"ctF p jD0 G ̋J*6TTR NJR t(#שּׂ8#J Ea!C64ET!k<.JL2wc)U1=IP&#,@BQ[_ly\FNT]vEaY^ybi%^PMUw{QZ܋2J(D;)|UXHV@kEϔ12*l᫈{k]Rc^ r&H煁vw(@4Ñ-S ?+֧(Ă H#\\Koe7z@Ûc.VfW.2,u#Rkr4Z4B&{~]s)d^|NdcE>Q7ɧliyi˓ޓ|΅!61zN~_jU{@[aӴ\VrkmvW7jYch;LD<"Qq6i =Y?pe \qLG1採(4!y44GjM1Ql?KW͖#"җ;J)RJiRJRM!o$RZDzDBE[rfr,'X"qƏV)=Ͼ~:SVʍɂ4Ro30fL.}sM85.XB23-ӈ "֜qdh.lOӗ@k0e?^1ns([Eꪑn#ụ6\ܔ^Rm锎L;\v|vmM vO 0`vfQ<rx&"s`qFN \`5&adtpQgi)LR\+p_)RI}G|Ƹ!q$"iŌZ4M `RB?~LZU-t8PxK e'M)JEƔRB~}E-̶w6PNW-%)dcq#0FjϢ <3Dz)?J2# C3掷#Ci`4 =Hw;0.` F!Vi/DFH"R ~`VAc!p;#0B4mM%K[OY5SqlZ2E_T \Ff s%SXh,g0Qh-h8&c-q$ATIcW\}(P}ETA2CFrCzSX HJ9r4:e pML(Nn>h1+ǐW֓$B>@ `N* !E"aX@~-׳?<:#A\V3m7'}>M<3mZm )Ah2,bC$#H{(sS1p1x d)v4K$*igE+S)}o_(L bHQA 'g"Qoƒ op9AI%<&Klt<-' &w &RFOgykH æ?mNіYJ%,K A'odGBw :!̀2%3;iԁ >Dw@CPHb2O`{)oV-DQhTJ ʤANς 8CH8 dj2T`p5FsK=; Y7ps5E*sC9V+)aX% S䑓`9qN5V >€:A_(tĪ*N &QfSf0l}~#8 c^(V@V.!# cw|lz-2DZJ*wYRxq141 B5cʄN6 W9g f.PDyô8NcE~ ;5f3ilik@% ԻXqQUJ>Gs h7% `ɤmS1KҶn,s{.*%~l y0ZxB0E o7ޅmŮmJSA5c; fBM)7,c&xr -Ŗ.M6G#Fx]\?J zќ(NPvvtz&J-?&699 ˃LњTbÀj a0e>(M`@?*V#+10ZqP1HgQRviFv韧xsPJ;/ ˍ+&Iˏe#Dfנ\Έ\\ ^J@f_iVXo *s>[a'VcF4ħ5 ekiUtTivC#i1'<"M{HI*qr`qA~lcTWmM~✣Hi逵v-G}lZ5^.؆.ǘ75x8A%0jKu SɬV=)W,=kô$pɖ͛5ҭ]Lɍ.80)ф'ęlObBE}vuPE5GxUPO'.QPPx8,NP$利ǽ8q1k'|RK(>S~ '΋J$z ՃY1%T+ $ oNzysI/SFZj SZeY⯅S~ Q_+:UK!eAfΏ>eAzünǥO:@Mص;S+Qi67ocS/F)2}HŒ~i'Fo ɵVY?[Pt]~C} v>n>Po?~hpsCG^n,FvדF7޾wp}q}?wZ4gg_ֿVesq8nwwR$ҘT_-Loۻ#||S7D(FS%y4$nRF#0qx>{lX'?N1^wOB5}j/OM%hЂ^NەDuO ;0s9džW: u{0ypNRڋyMɃϾy[.|>@UU;G\?V߶{Zusy+os5O8zg/^y=|wFaۡiHjW'N6~t *wsapo`kN'ǣSM׸p+$:6N.N6jxT{?@O<;6ɩŵ xvm It7J0S'_&  S|^t80.7Z'>vr:?{yqmdY:[|n5q3́sy[oN~;h>gGfԒ5Ze>*H9 7n-;J$}lKu4/uj֍w+Rn &%cǿ3؊|[=` ̐b'RzpD' pFΣ\#bdQjxtic v"!eA+:]):]vޮ具6-R@yG.'>?v}͋ 
.Hob|sY{JA|]rs己@RTL kӟT*O{5ih=P0>,(}c kR~Aʯ9H55d0@%D)fA;Ù,:I,zMdIO(3V> P~V%6elG WWa'$M~g&:5KwSZIvcFa nUJsϳ_T8nܘ,H3,6c"  AdX! JP Zu uQ.H'*O_8KC_6*Wݑ,&֫eS{QQ8{{#2:(&eRu=;f ])F" QWE]bPLKD9l&G逧ZxfCEf,1 2!9HT\"2075o-KTD}anh2kĵ$ $R2C֫E]bPҕ }bsNjx?:;tȌ$Bez =0 0izD/Jc\]`CBVzfh *%3LhzY%jpFDOJ2 b! fS##,`n <9ldٻ8n$Wdb|XlHp~Xf[JHL/m; gU|Xo$rf {6̌Q8Ws`*gX uv)5IW14lhj‚lCР.cdYxR# k(kچC#̡!f\j&LU4=^*Q,OK|gTh,S\]|x}9'l(THN"qhx3^'j'J]ĘMR*D[~VȖNby dr`HzH.8=^ޞ=}^:YdlT௿*TgdJ]Ť|`vz%N¿Ѕs#]0pF:\*`݄92W+Ίsrԓ@-Ќx8u%5rRNg9>]8 G"mr5rk&U%Wxʸ8dSŦ˃ʡ%3`L"j"q βl}ILjƩ{XҰ(Zީ`YuIabrs2>eR\>g`*l9~WL*aAMʠs)R+tN 2ѩ:'=MN/P;rޑKlMvٓ!^&lvO72: k};J}ګu~Σp-^~ %y篴k%MU&~߼ھúi1ՠAyv}s{ga76 ,'W;M9Ԛ猬R^_.nN  ֻham\{ฺywyP]j'W{QKCw.r4۰ ";h:H #4l: JpH Of^LJ8It-$E0^Ȏ$hfuCЃ ɖ 9肤eh $Jb] GʳQC!ۮ1= $:Ƶ[(^s$@[$E,$pOdG6Fv '3vD^b[0SLmǘ^6n I^Oܿ€rY)rl~Xxa&ܱ?z=rI;[h?:iRq mrҜAN8A( ipێ19RFJؽVC TlJ.Xe$peu2 A%YR]BU;vɆ P ǞF%X9èdXq- {Ih?*9 TU>hRz3ö.X Aczࠃ?*&ʰG9v8wl|wC[4I$,ׇ}H$zXi`Cv6v@ҷ{5ow>A~Q|mR؅M({Pe 9+I,ZIq5$ ]VW6rdv$ Iv}x|, `HcLTF[ !I^|fgNxf3>3[Ѳ/q`.jx؀ 4 @4&p^'o2mvxmv8<ɒ9>.#~>[viə+tб5᱖K ڵvh9=B >X r(6CnP7V%s Rm75Bw ЭU7Y]Eiow ǟo`7g~ӯJk;SԾZn)+FTO{ǎ1s!5Za?˭_7h?5wJlN.Zhh2uMʯgf򍚠Xof5[eX\ -YF^u 6hdA1&l*k:=%kEhuvbb(\ BMgS2\qN'bIfM]W7NWQ&]IC8;,e9ɂOX5//݆lًߟ+n|~'\vƾVUbl*mpsfx*u?Uxrj6o=Q}WU睁?cܢ~PIt  :+۫iPbkwzޖ|WzU|L\r_?r_j3_(Ms13OV;miFG[[O?5mow΂B܍ ;7TO?ituun}c3?lDӬZb)p&+A*NGrly~cs S,wH*'(ޅ^>]?|,'K7Iݨ~h-m#֘.2Uam 芌9K>{%flQQsAV668(NJ*-IQR3;vG7L.;\DB2KXg*6-Un6`+K)T!U!A*_&^iD]<4M9%@l!&G)HN$ȠG\mO2r_OBF*gMYۃ*MMHM2նoM*i(Ѥbf1ƨ9'^7uq hqQ)¥ rh\^z\/m07s BM(iàQb󂕂d84u/Rh2{W{&g (a4w Ϊ}hrYA3\ErZY3 ^;~][>-7{b.RMlVA쌩vÑ139;2#,Q!bI΁W-w^)ZX7JR&A1O=A*0K|Qd%/Jd|lHZ$1JI 8aAM/k->qmTJ'iC%\rwmᵄ!pKI1I!3nf{U=A$ 3gxgC=y|_l8"o)-)SP7K %YׅTt4@;)vA9ZbƁkr\UƷbq / g;"TPL:ȋkصAPF#n  |=0;ȴ,b.Ph;Ӓ՚ Owv )o&1+2hRw*Tl98@Hj~K輅FdzЈ8d0jB4Ahd?8zǑÍF#ʝ f^oD {WX7ƃ\r連h#WxنԚP1#ײwᙓ v-јHKJՇLvucl"$؅J6'<~HHvѥJpz%lǘv Io"@P!<\ 8lw4*2N > <+ wNY @f~m'" p"%]$t\FV,$91iAIX\OHb-t|y {visIbƜL8PNaH CpّJ%$tmEJZV ˴%;3m|4!6(:k^1yۮ1I~8U:\ɺI /TTCJ&.>Ҁ\ޫo@f5?eךO[BaV@^w!~شpz;܇$)T}ongc+o9iڮ1sL}j 
,̱8:ϴ$lk9hS0K*b6ۃP;B'L}\lPFˑFL)EA+[̶4ł%d[ fBBഈ5;3s|jzfFD뀽yn' ozd;% _]UMYL$wcn Y^ XfU*"!F#_zӛG:JH;ꇶ.b%͆1rUiqngnOC?ݞtsvScG#dZ}j'o'Hv"v)P(VPȲ ޘM r1LQ)8@KވIш֨~YMAxB =P"5_} C&_C 2#Uh܇DlᙢjvВc>>>.?DOG>vY<7Y%OVR`V$ԱN/)rkb%@?\L {,ZNb~e[)J3N󐇇%A5F}mf{lÇ8̘ SuS%N},]`Ts<Œ9`1" Z MN5EV~2hg /TI-?=MgpChNk{+ΜS.Kt%:uylt7^ͧ)9ʯfR[eGP, 8i?BL9NU S3z3 űRL M4f1(P4gZ`?6K̶BF:95Ʌ1 l23(s0a'$ɵ2sSTf$ٔFӇ6/h/[J@UlmX7sZLŨ1ån lT7 I'(rF`l_zÂq3{% 3 9ټ$1!c  @[.Txa-<$ivcC#$'fcRgzJnL6Nʦڒ12;P zlAƶ7Of"lvCu |8~kd=g٣q>%㯇2b%Pa)h#{|z̎4,Ξw%?m -Έo^}neF=M `,_~?MYyA]ʾi Vˎ[wuA r-{z}\.RK.Y|{;f퟉"R*H\le*dn󧧥@"B=aQK61LSڈLDTfAҫ*tbI/uYftչ|'_ ɿ-}mO&ʭɽHkE g$2|X/it{j1By]. [DWGa{7f'f*o]N>Qg$u' 8m"r1HD%(c(#@3xryY)^CpY)#"y=n('p>"GY/oŤH!SEQ!U?3ޣ" XOIE'#`8e#B}ܦqgJå#'ʦp4A[.=0jjmPqug՝wjwtuĤv:Iny 5ԛߺ7=7s}ӷYLVk7?XL@)P+%fP-YnXVn]v1?P\żP`)aiw'@s@IɜY*B&!s mFyM~,Dx?uEn40uxq>z. 8XK:GZR$Ji0s¹H! BT!NLt?ݙ=ʒyCïs;%&E<\DKɈiG C/HmnxDXr^ݻ%P1*R"0k+ȅJ ӇC =!C]('%'j*{흻wb]Oo`RrP]qI[-GΊ 1jxXGItp)Y?yW6q/I7m=)9e 8@+Y"HFVtw2~f*/gS'G96^q"zUw#9Y9̲QGHB&Y,'wC]?Mw(FN"r1/*sS>룻L3MPL)}dŝDWVCC,t?2 C(^>,kY$Ɯque:Ge‡1D:=X݃=XCCW qs9#"1 mJ:S*IeNQA(aRs!xM(/SŕSӘ]hitiUVc>XE`CQ QgiTs$qqf#J$$1kĨ* `T8Oj4>i(gO_&Y#IРnVSI*0 F]9t>ϛRy gT&Řݽ#;PֈT%(IR"xI`$?+9]/=szl1/V}xuhh {|^܍DW bO}!}"EULt<hd<#ޕ~O`7̛DYt[t}R=|hrؗKQe7ZTONU1=tV4R-{^|K#D{Mn[Va/w.F`IE;u2<FX =ZA%h ~*RoQ/Æ X =k P\C^%$]y2m5zSԽş\` ;]u--Z+,:8ģܥ\+)73{;?( 0:Y3@%0-dUtփ[OVx$,OyZLyzJ)WW_?>;s[/~=X'lۦ}{nz ^;ꣶ޶Yd[E6:Ȣb+26jQw"g(NFlBj .JđB Н$YC#HFp<u qVA~(\bݥV"lWxjV@IVC23BY"FMd`uɏ/L^KX.CU&x(y-Ye]yh"D~zL&,t%fQNR*BzIHØHIm,,ugLR")T1dy%*"̰HU? 
Sr `3~<BI;Zlu'q;l@\mj%{6z{h$ I)9 *) p*9 dΫfNRF4d WG2I }#q+ȧz b&q,( 4px8GL |\}$ )tGk9*@Bm0HE\RN8"!wE3~kN-Y;y Nx~rw6IVRHt=0RB84BuT8S*BڠȤJ+ξ ˀf SG|yƗ6(Z2;랺p/6 翾~w?ygOw!fƜv>+lL80֭'6k!0t-^Gvb- ?QX_wφ+0ԍ;Mh ^e6XspW.N+BpRhMpTIhᦢ5j1/@D/Jud @d8 q+4Xu>>/y#Ir+ fo{ œf=IŮm1Y,H޻IHd9TIH+!Eq1*Uke[:.tjRL e=3n[(vz# Җ)+RRp_cSP걑B`mG& !j( %X{.AJPE  i9.r@`$ \ ]PHxŧ2n$_o, Q{K!zW%j$pPQlښHe,7T&cU< ULpx 9 [ +Һ[ qcoޑ hREYEWEV[ {kgQq# ky$!Y;joT(Y4NeQ(bFxFoLau s^π*[ U,P*`dҜ =hB81tq'@b,¨&9܏6a(&c{ww}x2ZyeHsJ 5e )d !U ooA\K3z ϰm]9ԯl~P\a4YdW#,9TK.)@ދN=}Y,ג660=jlvFCw |UMp^܍ЖK'i%Z{`[¨1%}'T@m/VVm.yD"bH(A+p0]#v>|/la IȉˮeٌFrK's~N$wmEoSɾG=i\}Y:%GF%%-2EբofgggvgK UwgU*m3 ?0ƅv卌۸?<٦Q Maս-x{IȐ}e͂r pV֖=Dc*SU ŖtU ijV*76oL|4KAg钝C ւ7ZkQ{_fV -/e(~I,Їg.p>>Efל5R񑋵l|}N5_XSz烜K"j˂·nIkf.+`W#фn^= ZK^ -&BGWerSMG0g'#bM y3?Mm ٱ΋td=s± (/6_}>I"3v0a;Fq/&AȟL'QRѵ' ܓr]]#;JG@gQ H0>ѬpW$`zQËϳ9 ~=IìWb;nNCjOpy'>/,z%/Of8" 'c6:!*Ek Q:7}??{t<Kp0v8G$`kk<< Ox28td hqh>&8jt gٟk/_zz_7/s\>:8X5 Iep>o7/Sp<o;ٷp\ͮ1LGN1\txVW(h%uoGosBqS?82>NAco68 \^d*Ay7y6DS~A눸TgEg&mlxU|~ߟ̋ug-˧ 8Om ?}6UK n<o13? 
RE1uX8q"=NMo?Ξ;_WwzGw`=@՛HMV|y; #`XGi's⩟7PɅy>7*۷@3_퀢 ~%+ c/sJgdj}1!=ND;ϏOƓg?L݂[0'fǿ_rk_ QH;9.KW6gҰR>ܼݗ+`.Į+s9kLz&b%0iPu)C)"'hk BOa0c;aʍ-fNmdg;٧1 `Ds 0OV߹m_.o-'/jVad t/;vv P;;g篴`h߫ ]D\ T_X VEu%•P]\"BP#rR% u6`co 85ôzh<>oip0*N.нN/;:{$if c)׷Φyw'7^:;h]T{u=7 9*J$ŅQa.h?RNmOh9{;}QfÁ燉``t L<5r=*U9{D#b0K5T2KvLHGw:I`䷩֒[5bJ#e=wX' Fnٙ6|`+KqӼδK~{TZ2:`tԑFwlLUh-*+fJfXr{>6 9EȻKtt5xN%ZXP ygWyʥ!8[%ܸeEJ0qa%ocJ%m%XU6UsכZ뭝Psr7 $3ӌ/oDΊ!~A cq Zi i ||#OQg'qXHD^=VLߜ}{0GuEsg[L6kTFwo* U_wo{_OUpՄX˸/\!j۬s6]θ`dԛ_z!,izว0=EFlO{zJ;e07 _m2{NS8-Vqkܑ`Mj.&m BH:] Pt֥x[RI=3!yJX%&Pl5p\EzҔ(H\zMRm`Aθ'Hqb˲ e2rsݙDݙDݙDݙugj:Q3ԁmTSR8BUd}_y05D`M*bK.^& }p./}AQ@am~|L_?@u rIe!!.P%V ֑Jt׮B %0ybZy% Pb2 TsXktsaT.(49A:1jz~'Ib$&Ib➤ fuSŒ )7|@R buchvm)V'P!f4RvAG-;I8`ah*C0)" ' !iԔ1-nMzA*AJI3]St JV!Ǚ#̥s)Rs{p)vFqNV΀ᥡX=丐mˏ)Ҝ-]f 37ZhIt%эhUm]SK7՝d!oJv$nJv du70ՁL!e-:XBA fΟRM%7W p:h&X= ,4Ă|bҠDlz3Z0S8Btss]KqV=G]z'VHUդLr, ZJ7WI)xO(]¼2&,X hG`zͫ>]?oOrٜ^JqWd͠8աR sT0% \eO'QFhٱTk'\k(l$c%j,~)%paVL2ioz@'%2qH!x5P5˄jʻi/FVh}wJGK*qfIQ`m3ڨޑXl0 q)d%|gosruoVbRSR 744M Nj ɧS]T ρn+Ucsrk$\ A,xkA'v쁕sLږJ2c-B%#rm+QϲW(kl9UVO1}% |9g6`7#X5Y=L@`XN<:$Y!)B˕3NEU/>W`͒UiZk HwF%,lϬ]I/J93B#a,~F.nh□G|TtrJ8ô!$TjH%f 5Hۈl!n:DJXs{h&[xDN]pY37"PcXv:|6v 5'cI ݻ;M&f<;K278dΆcѽ\>\ݍ&[cŚxĨH#"aRؘl&8F4Y|u¿Wicon){okKn+(ik/n3V%mkym5*Kr-GFJZF%+2J>E`L[x0f=>h%wT l+%>4mIޛkNmu4Ѹ[`Vi7O<*Mػ0<& V5`YUeE N^elagrHN^( iNOiр`\_əz]_BO:߃&"'zXޚfxyvs'C` ߧb#WkrʰL2"EAkm+GEbvV2Yd tf1mttLC5o+˙YlIG6$ i;:)~*~K'LLIfPz)M}rw9 =L力5ZkF9e*AT[mHFSJxyU.y|"ιy(sd-OYzZ^ߦAJ5f&Z>= ӝ ㇻފXʠ)q{x'%/ɭ˛ws峟. 0{ONFlv3_ć]߁  =xG2˄JZۛ8aV)SZsNh(C2ZB%5U\4aѷךPTs~kC}Kl;,eAQ Ql5nOwkd.GKNwp+@spw}3AT<\>̮צp6ێ6|P_VOTEՏ^R[rKuZnN۷T@whM\9! 
&#1C=Rd9 @hP9С/1)7^bQi97-G޴}TgC>TÍLg&f,6#G1Š4E\| !A} 5ZN+TbUJh&]i~.Q<5J)YS(jT=9fgtӢ35-.}2s竊>uTvLHꈭ;5ubZF$=VkuQcbEN#:rNToz g Q]wnkz/ Yke|^4^+HS%E>4]{oMEk8ҶiK!b5akU^^O{@y>PքEUL\8Q`ѻmi֝mi֝uhIX"p,q@)Ւe@XD:WC_4(z˪~(+8~':{u@>y,ˋ'?%jx;KԴ:7QF -oªxK@}Ҫ ՍV/V ;=:P*%3|aV`Srq}y-:uХTR=/rlc2p~TNmh0)#=;ǭCc|ct= nϏ>Lw/~qhSY l ecLˢ/]].)/gVfW >8 vh1v: cU<;Y15jqʺLJpJɊGP]^$ȏ}wUoyNj2-BA-O/OݞΒT=\Z,8~haAc=\Oehq1{ TAZo/D9>}˒F d=ACˣ2JpDZe_αNAYN!Ou"_Xg20c^iN7zA[QM &kzߧI-^FaXZޓ2!xd< SƠu>:9Oa~\c9::\3˟ L1EJ& {Esbi`AI1RL8 j)fLvbS<2%xm"˔eU& :Ũ=R+"S7|brgVZݤ)d'Ydf4ZKx ܒ gCŠBy\Rjɞom=v㤦-7N;&Dǻ״0h<_xy PA0妈 b̊~jwAm#&qBF"OH^,"@ĀŴ1KuJW66t`Z#خz6+"%18NiC,[p#՞(!y~`PSsl-  \O<܆SH҂!ǒE?&gǐȬ3{]h9-ceI2O@yo%jr9dTl0C}tT+>+ Asfs>o| A_g@`,W3r>,g|#itL|(Ef,d# oedY,uM!lsY^ÈtCk3RTH(7N6'gc=%" ogzn׷̝zJĉo(oFgWr7cs(y7ni*& A0 .7PŐ#9c M4iR34ÍeBk(Cb`4L*٬e8F3j7/Cjxue/阉GxO$sn"D j&Q@́,:aQ `!m|C:M/L39zZ %Y`c4 )AزZ-9Di(N I5vşf*0NiM(b3tN߀]"3R0ZXP!GTג=A BjI㉁hraiub8@VIc#9-="@d!b$ªxnzF3-|X"o((zhߍh[Z1#&~ lNMSM([$%K0ivIAmqwM#.H:y#V)Iշ=WV?FikSNhRh&T{7,Mk.; KyOoOӯ5q+3B5ͭX҂k+ e9ݳK'rZ* Md< A{ #m=8@^`Zڑ.[N8g>[e*F"},nZ kMM^ 0#4izi +1+TpWV8b =ZX9=zǫ6e. qڻ!ft\S̍Մ (aIҜY9ƜRm"&gX ߘr@S2j I;=yM!(ӛKs=tMJnǢƛCڶj\)k0K,sbGKyu2e:X"( ;g}pH/;|qiً\P}%T%(V+U %g @CΎXNJ9Wȏ6%~ rO6{ܜ!'ڲwP9*jPMAZ>f[k2ۗW2ؿ~C&"p2i.1<ʿDx(|؃iLS2;c ;+M[5*ƠŒw#1B>m(${sP[p[HomozB&=yN{.7R* HdJ+|bwQKvf|2K%E~zC/-'wO>0#3ȳѕ/ ҝ|K~We%h5 [N_][{w]#M]oɩ9tZl>^QVfPRHlbI扤 3iv{wr"c 3grMoczT2;VY rF91RCE^#R[1jiIaq\{繪zQ$;UdmKaWWdzAi>ۋT%_՞9=~-yW5gI{&/kb @: _GnYL3Mᄋ[17l0ЦXvrs~cn)̷2䴐i!B&m2l﫱YgAFK!z#15S"YRMP[nҟ oͮs4fތ>63H29vVʽm4!˼yh%s8kQY[ j%5{UIn1([0 S% ԥEDM0a ,3F{6 rFjߤ)VMVP-?qYO;y9l.fd/ EM{;'E)Jm٦ Ӗ"XU-S,^.}_X+A@YX-vImNZBEh#u^mUJBSg{Lh`PW(NJS#^%bgf~L `z's$1jTʥb9 2lKT>'7%\QۥWx)KĘR/&OUgkcu@FwA-@15.:/׭VMqkV${I!  
xA q %s6}32)wU/ ;ٚr=DL4wɶ/G3% ־%?FxtF5U =Mpy笭:>]ɥG=iy?xMˏ;giH-=#ⲗ:'ms;߫INHl~Sk: &$h,oO~SWwl' 8뮲Mp$i}W rM0ǗVqı6FM""*$NN9rd,m%7؝"|ՠ+:`rU(ʹs\z{RbZ#6S:YfuN=[`%,'SJ[MOPUإRָTalD&$`FJeq[1r5"Ÿ`eJUi Xe٠6MThKhڔUx{qρP-E Ā-S9T [<4Rǘ%UȈ cTAQ%۬i/F\Bjm8./NJR21zRP("SG9X |$ xRH=Eb6a?-Uc] ے6^P+LTۥQo5./,fBBȏNA= F;=?f?i|0HFk4ew:[Ŧ{3VQ !ay_\w3A{&.\4:xnf/SQsr`s #ڹ yP,:;@n(53d`{\s%Qo1jRB3ΕoQ4UIáƱ\H)vyhT"}.i*g].k/DͤMj&jBRS9Ëp%eb__5{l8S45oK'5 뀇})[TA^ceoVuWu |4״H`rzɆJ#WE}B?2B67g4Yؙ/ d{hb3)L" ^leOC 8J֓X^gS#F㗙 /v#qxD.UwyO E`aOd&0~?$rs_ۓqN0&,\a{Z?ef~8Avnm,HQ"Bbꘫ0LƄXJ,my)wv2^X3.9 Bg]jl ƢW*r}(bEkEHX0EAq$!0dcL)T%qJ@ʾvT+bOƜ`iC1K/NA +IZA,B[^ ŭ!Do\ETG^M)jUL02#Rҁ='Slg54k\\BPDHE f)̇ǔT#%D&H DҮdLc0E@"J0&۹cLl Q :/ !u: E>hyWJo>"m`RLE)lZ.B1b͓}~c& va.t|_c [3}x?>.; uO4"ZyȆaL_Ɖ]~?ۢl=:*u"5wm0( X>}s?'GWXovRzZ47YRR~\*}ЌȚZ9{2gdvs+hwNJ:Sm'|8pI AaN};rd_A((i<;.Z-yѫ7 x sY,ޘ':`MЈ&¯2m6q4&Ɔ@\CB$TXDU\f6z^kƲ}AAUr'I^.;˓Q@n~AeW3k"L]NlcAE=HFsd\շף)1$^qy@A-_-UN`qǒbIƎK.`tNFkVP+ IF8֪U.eרW)tsO\3Jt RMGm KY߃RWLiجYȐFuӋB̈́>czf.*\ yaL Ne2ׂ2E$I>Ts0!9gn9Qg8 ?q4ڞ>.Zi1{M/zS|wOjg A|y"wN>륷f.Kqpw/^^zτR0KWYw 4ˋW4K]W[P ^A|O՘Дr5*#8Bv̀h+UMGs'v}Arj"s}!TMŴjgCz 4 ~[UQ[eonq`omz 8ں|#v5%nٿ답pLh՗k ԩ}Z,`{~ ޥۺ  rvAP5JTzmy4p\<b⮘+ϣoI4NS= f1eM:n?v ƨ/i0Șh˦m7n! yt9n^Єb"n97͕ń7? bɥKL˂ùq &4"؈SʰH "pIME1L] zs\a]()鲪ebOc̴; [,]fs޲Nh-fd@ybs?hΞ YN/ZE!n[5qW?ݹz+㐲[Ҩ;ueɣY"!s$+' .ҷXm9ṃ,g϶[']w̓FI(8յ:ZZ8Ayr(\sowyFBBS4>/=X蟃&Џ}p{d}:uo8 U҃ BzpAH.lӐYi568?IcͩⰗM|Or'6BF E>Seƭt]e&Aݎe0܍He%s}G[%;s!-Oc<`1/9_޺J"y*,dQۛ4U+$v" 5$ۛ')XnћJ9qp>a>uԝ`J)Z^n3;}7f6evAx˓eTa*R~wo|S#cBO!7eK!5%T FF|~t9 a~pƣn{oݲ/O)]'W}tooGZI0\7FqiEZ ^w$2FBd^u c[N2BP!InNc %(hWMU|49_&6/oU`5eW2awdͦӅL b×%*rrQoڎ[5ėR|1,­eɵ__12S0V 76?Ǝe(]eSP!u2P% ,"Bh%I$>#hLNJMfƤꎰ%|΋4 &Quvګx1rPo\ƅ\}6yZY k-/֞jN@_)!B\ (r_XθUb 'Fl SEt2ITS#$eHQϫlj36)URPf6Z|=M{T  [>e/v#kGۅaq)#L @F 2A:U2JH0)T()plS HU ɴ'%rf8,5-2`L"qSr#J2h--֮|<[#Q`žsmE gz_1%h; S2.dpn^΁=>K ŖdEҌ`l&YUUQj,! 
[ΕPkW(p4XgRՉ7!Ҥ01xsePQ @T F:UzPLKNV_j3-1Itu>b8';÷n@sS@ݖEdi|FSDr:qƑ"*]?$8pkNpF82M]%}~_3a/[b~ _gLQ2=p8mx|9TW:H@w"(QXzu~;JtagƖ` Uނ2ciZ'gI,1HS8Msf؂>zа~kd?OXZ}RTᐔ9q!'+X#]-,I|>[o]dv(4/K8ws/HAa}kӛ;t}vAL\ӗ:ǘ[8ʰG6]Sa4CLݫrOe6gF4:/YSB`#bEFǍ%ynQGp oBݾt5@ g4E PPR; 63d`NX16]ZƈqvׇݬʂBXŽauYotNnP"}SS pHZ %m{ٷq̟=>nhQTdϳ\I}nn݊Fs>/AݔH}eej_Ԍ\QFЩ~Oe-G{5\~Шib% z4YtWb"MO5Un7EOwS);r7O|y#eHzbEA9IQvs*ţRpOK4 aT@cJZ4]ZqJZ Ydv^wGYi'AG=O1T ΕF2CIך7RHF~Yd"R'QZA\(-i@ ,K*j4gt-i#.H?G 䂐pwt"Jj c|ͪM*eIؗZPb` tI}pJJiՎ|5; t K=}羚5Ϋ8FՉcng*OVsAm[g.D_ u[=&Se}+HԵiq, $م9(B1}I*\TVN;Mrz{ˏKz%l^VBOR%$UBOR%^ x:CpGo%>.v ǔgO5)΅*cIiqQ:/i=*_uMURXC:ŧ ;*~Jwj4f՘ϱ6nf/nBa99EQ(D#w$e&){0IكI={PwLZp4 #Ez(cp<aYL4[ "c@]a 'Bʹ. 2cI?&.ԆvEZyi#wwL@Fj*8pRr7BuNBuZ'@E` KU~٧?a h1N+k!1C}(TO;L\h0Ue?|G}Z3Ne1Ui4R]|%1dRyySK~B喺'noN Eu$FT\I_8Y"YIe`7pj^\QWc|ϝ 8_3IFdIѲH J("‡%>ʣK8cR'jm?UKq fOFF.aT,g#z, K^e ^zߊtGFw&_H>?XcNpxtKP: ti=Jy-< fRБ|=XAцi3 6b}4LI3h v"%*Vڋf^OQ252up[@/5׸OnTR,0RIoYIqҴgZ,r`w%`'1Nlwt u8(":]eS-1w.H4*U%uA+X:v֖$jFSvBƚN (h5BKВ!h[%]NM$_0jb12B鹲%ʡG)ӼD<ed#qH5Sׅt&ݶA"6 /=bb-\Uf;& O\I'kqKIA px+>tD5Ʃ$l n@6ApFAZ>Ru1z ӹd 酌[idAtTmR׬}=c$&[X:GgImĈ6c.ѥǦ,Yt#b Պa%uЋ^{(Qo^$ .>~*]~%m Deuf߄.oCdC8 -|gZ` кac&95ڳ\' NXC@DiA%6.o{+ z PL]!ҏ;4@Դ-d^li_rT,tZ4Kҝ͒#.R7K᪡ruR)4z:~@2-CL$HBXKJ-Dma`zrpp]@xn2\]x滄ӣC?!ċ0Xqg oz=Bj?~oW}uWGLgGIsį\ϯp suM.= UݓC/qVc' ELw׵ Py=vJ:Fv@yZ+Zu^hvCBq_ 'rx2? VzN&q6$=e?<=~Op>x>qCORVk$o\R#(Q<{O1 =E @b0Ʋ z7yۑ[.W!M=1x^kncZptmhzi}=u_a-t:ztfnFOm-89^>aiWڎ8e㛀H}9~IThK+J`R1[oJEH0";L,gH`(3mƬen{WbՉZ" 72#cnMlKmg)}rkJˑPX\U4.׉VY]5wsYv OEb:O_ ' =Y7~r@cyo!~ fPį_։y)\Tӗ:7[ʌjǧ(:X:{ `I S bIn^z턌P%sSPFиJesjw;jܜw "3LjF2jK.l,o_w8K-;lo.*͝ >^-<Oq^Bɹ3\~8ټ%-^'}MՓ]$#,ۜP0'k7uwg@-)Եj?gIMwNc6u[P~8_Y>|h&T,pS[! 
C-3ghc9I[9ŭ#,i_$\qvm$' =Dsk9('ol?푼7Zr9Zm曓,j7q$n -=n@)BճsH84x^un٢f3 p۰PSϛ385#+gFϟ}mF_-8j47gtNG]f6?{_|?jвҼq(;d`$TIiNT+R?=-C&vNyiR8:E!2C!DI,rVtC;hruXNUk1_ p4w+4p<3^`P*G\G٠8Oְ ےlXZNx\}pO49hc'缣$=%x1vgQ9 px:Q>D6Jqa=1X`\_ggO.ۧ1~PEjW̍4?` xi4IE EKK󎚥{3F]IYnv)jIvTߨnpH0pYأGh& %!9AE3zti>78:G]>P&J?tTH1}I+rQ+Lx9EctKd7JIJIzܧ'eTDeB)J\iuiSZ`62(d$[U֖]GW䯫 +B\i:U9蛳q1&t|o464T0>$TqN{}5 D2yDpY(0IT(0Izk C!' ji(hsň Z%\ 5`ǁi-т2patX {wQ}.p Szl{Pg[B^hogs_HF{3Z0,~GhFJCJiDP8(+.y3ol]"$sгΫ$%?{WG,6edF?.v0Og[dO7XT*f塒FdU"Ǒ4YS6nU9%3J,] `#EO:d̒Sl^k/tPK݌b!9VQ~~ϯ!uvĆ'A!ʔUU q),OԳTb{wj;+slƩS.W|;%9tXl%k]>a.Yi}}h#N^5~~D02 5jwp&.Tm t|whҿ7-"虋IT!T,[z* wk>xlxw̄G!jױǬ:Zu2gc$Y!n8dW뛫?ݤ~_/n>3z~zssSoxJgOB| S<|"ӊqSFNu}VJ<x8J*9^E$m uY[9K9gUS!AR&kǾhTxؗ*w8&)q 蠛1 yql7'&YQD!ESfD֧$ KG+jiKyiyU$k V\ c;wvkV_@t}_/O\]~\^Oy'd &(%EYsɑU%bio>~xzq߽}y[Fxule =v5JU7[M{?]1ojz{N-yOmoր Z$$H]]`>04hxGj[.r4aeکj~̀FUMpx1Cn]|%F+N)>z5uynǸGa9L,]aM*9 "1{C{xbq=)j}_b-Wp=\ũ}S@Qk}vrˣr]@$e{ [e;){xt")<2e_q iptgk5kjgh}5a/r7|0 s͊ YfeQu=~Cpj^WiF\NJxm&6|95 ;M䋏 (9Kq)m,F'hU۬'^jq")PE5C(#Q2BHًhAB 29^ݨŁzA@i$`)1FJxgwg 4NYk W7ۊ Vz (J%h(N- +=7D"ʍZ>xT ѴOEMQ'fCr|us^x,n>^ڈruz5>'(g10H{P!D@+5|X9gZ>M@o׵G3>DZ&q43zUGA(>(' ևխIQ.e V#ֲ˞H.nzjͫa@X*(p߄w1(ſ\ dY. ɗΉc(t%JkN0lqkڐc0i؅wm*; !jJo[焅"qO/QOcvDz Z=_ 2W" [ E,xj"vdKV )Mv^촒{=a Y蘐\ы/vխc=\PaP$|@}-d+ly:0v`$FazcHvoZp(m;xo4PSa' VQN]vM lKnA4!P0#3k0Bj1tM <ע6=M75CeڟMDvNi7 #5E^?fA0*,s4yxjPNvhu {ߊ}f]+\o*MY<7g~Ah|PJH91UNc6 9^K)C-)#;9`kX\c_4 ̹ؗZ sGq3Ћ?^^ y꬜Ug:+g}g6ygSE )B BDW #k!(,( v}h/c_~076zz ºWp})|:ZH!W&s ع #(H $%} @KgɐۭsRv\ha#-EbQ0Z+U4 0! ť~af  ! 
fVef@k-R 0E5=gS!s6Nu=gޜͼGqJ(kf.%-Zr#ss8s|ֵ69cts'vFLO13rݑFc1d^֡)œ|jF,I0QAz9C@\R eKvui1T-L"6M$"NZ18<ښo" 9-pZ FvmkH9^.wS.M{Bv1z=>r"\BA,;8!:.Ϧ҆Ђ咦WO-H䎘M7_Mcv*[q=6TCnQomu)kg<9h\1-pK7s܂r: R Tri,owL.5DfXgQl2 5BNIk1 YGy<׾&zGzt ,]ɇwwb9>c78zlZC";88 wZaSH">Wi3PD@ʙlY[ KAZ]"C!d_Ah4u0Rh "P:OQu3#TFZLɧ0xOEh$lW>Ou`&hưFcXFB9QG@Y_cpe2, r5@Q w QIyK1I AN2V*eb }3#9|q1,seVn*J[b4UFj"d uL`,%#s$^`Pƀ1XdB` xCI;R$,LriRykd/LHFAe6DtQgy#DeyP(x2s-q[*zǪ6 CWBuQK.Q4K ޺BH ]Lq dJfzv8bJXBbHU X2)y$0(B,)*#k0A@0 k0.H^)"XqmYYhf ٮdyAlECu[6PFDBckZ(A3(C]V2"v\X* w 0BڍgSl Fq Zo喟D-?anrx0SW#xLN"15[)Z=Hi6whN;}K5-/N^cB[l̩q|cnel]6in쀰'WPldZV3R `7)y@e7ם&F 6C8pٚ-c"s#͡0b'Zn;C=HZYYMfC0*US |7d#7ɪ5'#%K-\bV)@λv˘ְ$8sז4f=ʟ'F$sXco:۵3RI \:Ixzn?ύ]Z;xm݅rZD7Kn`Y\^&qN45ρ>֙C+ZJwo+=#]\8Wu*jq);m3s=8d:;p>w$Dj'j^xǽH< *A$#Z3-Dڨ ^a|W/ii>~.6<5!_y@X<>:o^P,.s,Ү"_XR -58E')>)( ^CȺձld ̨]0K@ru]^c-(YHbdCzJkH#s/.9:cf3)$TD6L ڪ{G։K0XBڒM>0=` 0r1j& K$ 16,jfM.Xmf̛o ?{q@/NBz v8(텱=39Hq=3v,+j8}UU!(ܗ09`_;u4Bc#K .dc7;gr@jA88NMg`^Q1µG3tS4]spTaO[*.AOO &Ek=1q4\/NGW2A5=I?m+]k17ef$pR*9"W_4W4}qdl"h|oF&uM~7\1h*$ 4լkJ0ʍT者tZk)ǒkx;Nʤ~-@uc#!#W_50_6Pk}1ٍ.RSJCr*҆ݧX2:̶_7jN7l yeb&gm#dڈ2J ji#Sk(yN;`BQAEZD#BzD߾}qh$kY.>r~=w!0҆}V``,Q 2M-WUDh" [A~ub=|6Ə ZF0{(NF Q;ch@`\+֋<ZO5CXlʿ%2Gߦg`wS43.<78H܈IBF `RQE6Lûl>Hι&WW`v\8XdHR}^'4sX#Ac U";{|Qr7`aӪ U4͇!SI[IVJP}aqӫ&4z@oע(fDo0'a12V޷(m.<5z8kVO/j6F=ܴhՍNjcTpT[g''gnKl-G)[c"kY2k)wNHqab>ok;zu{kx@{p@Ka,;*\/:wFZA 0罹ͮZpJP~7GSn0wǘ$ ""9%JTB{C,ZጙV{r*Wv*k)FjZ S؋x(V+/YYt<RDT /@0,Fq4>GZ.t{LXaGqh";ɼA+Eʆ DLWw{Lb( oCbS1/i`~A-X2JBY.aLzu:b=H޾Z9%61ڀhZL:H3d 2je&+ v`rYmYI$C4EUvN)Rc0F=W n7#v]Zt-o ֲ b,F1R sFr#R)9S+xhfjl$p{No|kװ-RVSXeŎk/aѻӛ郫g),oN$3?/Rjs6?_\onʙMmT=ON/0+?l~2k 钵˞Ud9K7٭鏧iܧwa^rQ( E4KjYNu{f`r1h":mnseSX fݲԺu!!_fa'q>nղ#ѺbDtrhcݎե̺euBBp=X8gX|BZ vw/ŃE2Mҥ+PZ4}7nyh{s%կ`\;YrlZ߬- rVҢp!|l\ścu6Pf[]<)ߓ(ei&b -sǦ)L)򨤴JjfXlf͢Vv@J+J:+-H)}sȱi(Hȋ]'xbDߤ( :U~ŗoJy.iLl[VH]\aS*9DNXE_%\tr*w`9_GͲ/W/fs.qnB9,P~dp`p(fҧZ/)MVwoԓ*:bOÑ1uHtto;d寝=k{okCބm~ؠ;r}^=I닃Yn B:b׫~J!G,?Y ZZ˸=JJ:fK=~I=p^׽=~^A~" s^A!u3fpV0e} pkF|-y~q0f:GWgJuyUoE]: ͉pT"!#b1:G2rY={Z&le)C~ha GKuF s KauuZz,g@x 
=Јm|FI6A~lgg4êk-qVV![3m圧ʶ\N8ϋ#bI1Zhы6\PA=O73׼>S$GyhʁB#!#!hZE{-'ծz^EqV{")Bűn7 Rhu/ j%k (07JI']$|Mxר(5Fp9qdl"|oF&Mzs涮77ٜ(v*V CS z pDrLG 3cVSCMH5.~J$}.ަBeµ$.EnCŢ'ClcҴf±n`YVI:"ầϓePm%Mް+j8xuIذJmǮ44ek2J"Q`9r l4QR1GD)L1ɈvHG42A5\ćޕ$t/ND@/h65A,/s8&V!L\ːW{C @!Db7R9Yf0F'UҰax9l7N赕.* 0h- Ga]#SxGSV~FN&Sl$D {Jl#x*+r 9iq7-SR@3ɡIC`h#}BȍJ 6]Ľֵ~}uE11"C+')`'#2 UD`yY i-@/yCF]تLgm>HT WzvĴy (vhZK );I)aq3ag̀10_7PvVZ=lZTjJ__q^-bXR^pݸ^[,nFWښ3Ҍm ;|np7;0Fj* ԹjYV8:nPA K$H=Cø-.2iLOo5wsBVO!+'`ŷ_C6\~J9qq~s0 D^"?VRׅb'=`SrYhGv?dLIP *YbAȵB#7OlYn%jVqxVFI"F#mE̤% Rgqo^t0{&\ AMʄwK6WlOPΐte=VU߆E<&#J5U @x ?Wq@bTQ" uic hCTY\4-GB &zM.>u".~S+_LUXܭ?R)Y*TC_<+=+k1G\Pfǥ%j+8qi᪓Eom` Pi,fUgM$*UAtckӾ%&@_y(Qy]@%!BEMFEFikܺbj"cJiDdg&(k}T(H%X9b] Hǭർ9ƯUi7:1~twڍmv(5ܹrBr`(c섶$2-9SY8F8`$y p(AA5̀92˶ڻh:RKhZyڱ.?BwEtK"0G͂ Ԯ坜Ge0>'ڢ Iag04*k-x-֪׊2p59wdFA\@B:8k53&F/{L;)HXhD{`H'p@ۢC[ Ŭp ' D:y yײnaIX|Ycm)U5L?@}w}(nNߔU}9M|q?Y_y?gcb}]͈)T~F4r? #gi͝MnIJg>jGt˪͈1Wf;Uwa^Zu$!_Ȕgg cNNbDtrhcݎfݲԺu!!_fTĩzϭ'ar1h41fw;*S=uhc߭ EpR)e_˟_7kE@w˞bpݖ65˯>GB!)B.)Uw 7C;Excښ۶_=E3yNNgu&I4DQrf@BɢE$zE$a$͉mvFxKnme&Kmm0YjSpR|-2g7^OW2{״}4*+B֭(s!FIy`\ǽl|bBUϥWqgwdԵW,ںj9Q+(;Xzo aVQccRĔ#Yb @XP@ÄD0DcN [PoZole&zABX-K7^@겘< M)^ NRSDD:} Ȣy-# iy[ʄ&IlB0jn!bsdJňqdL=uХD r 2_(2XPƣ3D-7*]hF@`3+W؂+:2 MeZb34)Z E0,z v 8hA^<{NNxrLVFw? vOorugN)VPw*dnO]Ύv j2`c5;h̑w-*V !27܌ZHdS,kYF9A5ڄRHXh$#߹Sſ[qJ2l|3#ꭉ^Aфk8mX&i- B#Q,=.&Ufd!BV?ѳt@'BCCIߝ Cu1Rs#",ڙ(XQ٩ZX VN-JP R?O~E:K xJs3K!vwz.*ӵʲ_Fp01Sm}Gbm I.q|>Wus_Lhf s $aa7'goDZ3#3HXoHNӨ(ȏլO nl ]r X1ri>%8^cnh$bhY,iqR* L8c``Ec5bGB9ȂhTM)7&ΎՐ3X1B )% x =fq<{J:c)x9e<2ڣpb0ԩRx=jF`j&x4'gNb@W%+Qxov*em;oX8«:w=2gIƻ򝏱 =]!" ء(VF9BpU"NTgx /syp\=.Y HߌHySJ :.wy rW8FnkP?u{Ϣgy90x0 %g? SwLcMp @0B\HI>3@0ـlD>? 
8=M:6"R#_8}7BԴpAUD0֘n@5RC]Y|a`g )=[|s*N|Խ[&$y3\!y*6OC6 3dz5wz*((B"i$5*!sr.8;XQ&58w'BĘ@Cfd24b0V\bY[<.l&J"-Έ+'X!T0m}c-V uU1 T 'ɼ -;a&04ɫӨLyeqw$P8zEctT5҃g(ȣ3db쇷B^8m\R<`Hz dn:Aw?ߪk2vҟckCH:mƶٴM|6_gr'矜|/%7{[|gS}'|Ʋd[C{[rعښ /:_@OnܻH~HC4k2d Rc_Ev51&w')kf;I}w'۩?лw'U̜ex?/ bu"T)Du!܊Wza&-}k2GbbNu%1q۩uIZgj`3-tݿ{cui[zVߵA|Br>(ݺ`SR~ɟ'ԕx>iS(ڄzd#g"Eudce"+[dJy=rk]FѭhUދ(ܨ[Q' ۙXp"R$" AH)$m|HqyNBHi͚WHLzkhg4FzXflsU dA9KɝWVhc.'F#*eP :r"<5wGTnKؚdpҤ N[s.o-`zƪdGr<ì3NHKex+Ͻ2Ed^vם޼ TNN[~tuRaFSd~x.7ͱ[ 0\+M+](aC!cabņ2/F <7[QH$ SmΨ;bk߉PFsQ+gr\gw5g C\w:&DfG(ccN-)Q;B fbxc) ([jM܄B5}s55!4kkIr ЁiY.dRsm3rrK̼d/̃3r`ˊ2mx̚Ի&!1in/ݾ!C( {nlGN^aJ*_8<( "!+xlQ ^ЍÐƳ ΄`j֏3|:KY$ByfVʢ4]1}t=(:QJ;B_a+1qBf=# 8WN1J 0bZ fAB)N0LֿО,٭ ,,.+ZQ:]"Q2nl5JlnZ\7j(MF_"M}}/v~EkUfw6'wB2`qАfU)ܧs=`h)Ŭit[ =yZ.ee⤖*\Ϋ~OV??Iìι2Pjʸ8NTn\\&`#r(PV(nww ]rÍiCC5:& oB-EŝvF!'B],`ijEo? cIyGzpAc$0N-T)n${ڞe2//J&Dk%2WY_PXPOV?M;7zb“:^SO>پ3-kT60o`QAZpU#ਯH~L2-鞽WaޫgUس*goޒĂ{J3h 0 fq)ccR\$VTɒZ]Zxk~Bq``K $ܙU6՚΅Da.$ s!Q s!KY{"K 5)F 1MZn41HS!7%/`k1շXR}+ h,=ǭTܼGm[2(tbofH} \sL6 8D)% @A*eZ GTTP @VR0@x}@`Ŀ pȄ* Pk}jD%8ʹqKƘVN \UۣpJX p,H^@T&w),[@Q($A0s1R1R0jښ2+¯B5FPS#7iDը| >[\2#s[ZPH/()J{s(7rIB11@=4j ZAr,_2ә3%')itt)`D0 $́QΥE^ |$ $nP4+}J@><Ǽ5.r7hĔzs zOIRJ= J ( rްjtC*RORz@=Qj&xYi, l EI]jsI0 =%̦xȁgש@Ӽg5r؎֜o9jFNzN 6HrS}-Lb A$RR}h2sJ+O^ϳmO7zطOf_[?o/{kZ?A⷟w'}mޛw?zaJ.߽E Ioho[v#n/{f_=k=??F6YCO4O~x769[/xNm^FE"eYszEvL#-!i{L#8k; S,4NES1EZ\eJ5*!͕W)ST"IMt}¦Rk e8 ?2;ڥƋӈ+`1UYC0bSx= M &ZCΪe;)ԚA7L:қNPuNkhRZ2ת\Pm@vh^KU+-AUv(5jK@67>gI+ak (YW`H/% `Mմ~aZ YV<fEUf* s}w]T"ÜIFiO$o<}&Z.Dg5nPll(O'7dH(9 I n!s{ABBĀvXFK(k1*88_z;Bϗ {o ;*$2|gФ!HPDމdå@;/BH)u[BOJZ48 +1- ϕ5gpאTE/)H^+R5z~g[Y_q(ֲi[׷8.DʸBHJ)cu=~ "ljVN^ 8 cR61cM[-`l;aKѲk9g8H%B+e~+:AtCz;z۳촍ֵB*:-:R/h9#%<+@o)Cwޱ͔޾KyA#i7yL)mknj- Y@7'6+;1^MGvcC /EHX\(tA:}:9b "90mcsY >|pbm} ~<7/_>-r{#oЍZS4Flg׊^޽Uqmv:}\QiC&_Mi+o]5J IcOx̻εM#GРjlb5NiӅ! 
Ic36}ֽh(-56 ЭUoDoP("G_v\?n8&ZHA #L>-dL*3 SNs!'[uZ䫿2O/>]+ ?4%+a@f^ 4s4np3Z)RC)k}cQ $r|_KuyZ/gҏIXpL AՔcfE~l@QEYn'6uE62?V6gYQ "C۠lE@>')>θ`= m͢_ԬݍOmSZE]5j̤&L`D"fTY7vi(6Y|gK.LG}"㬮v5Zh4-6o.wPI_I/H 1q%2_r{zf'kCpڑÎD3 _?PbDae5yT^P[^iDؼHQ x5ߘ>QE"A;|.Q7 8v/ I處dtף|}6>r ==pVUlp:1յ)ɛ D@@&(z1 @8:i0JC;UF2tX6%ܓ3qV|iG1 g7VUr0;bߟ7ّ3s.0uO;p?c Dw`(`u1 z1)!@76 qph]3. #?'皂kI˄#\Sy밳Ж AQ3{DuHt]3gdԵ {>EpU>ǜOd$Н嫻ی"WE~dHA I\;˝rrQV/8%tݤEڷTg>],E#4᭻ KKM d-XLRE B4CК繩9}VvC|Ua6O!+> (o::z͆7ɸtgӅ.ܗWԊoL F`Cv_A9Dۅ/ i̚ z^WKNJhL9H)]Ln E8#,'[ka}Ry}[^63K6Ze/2!BDYr/&# Y"cB«({}q>:eH]BeR %H(g1ުiӌH@&5f啞ťY^U\)0}MA[ i>p²;V~zP4 sL9keM_OgXgT)fgX!4VV\t>\$-ꯗir6\t|cz`$U2[a*NqruZgN9|j #2`8+F:19b0uvNms<%f_&ROÇ3m~ǰmĤї蜡Fn_xssnMۊ 9BZ/G x>zIfb"ڝ9kqgH`%΋X 0)zwx Kq.pnS,p@H3%)BFE)d2!H%E&XQjccV A'Z) n VKқ,08kFB_Rߘ.RdbcLA b pV?-b53o-O-c5b6E[לiaFkO.4pS1hs$40s1,ZKTDz3Vfcw-6HP-Ju}sju(&.LVQQ<]IKI3bHd3 7&F,`@K Ni^i$ 9_ṭGΔ,qVJRd2+dLA %e-U}6N܆%h,0:K1+7O8RP Bf$<d%j嚬SA_,p:OV =i~ӟXKftOFR:6CuTS7I4Ya*i?=E]GtH hU -nB՘b,ֽز^Y!x _x^|sx8aRQVMlŜ GB!?3LfL2 doj.7H+=Cʹrʾ狛fRGis!ls)DJg[?*1/Mꛈr.hws2hL2˄E  MܑN-dEKn q" /! )ϓS6Hܙ2t9ߏ $01H= &?Ϳ?hֹ61o땶v6wo jC*LBԌ OxUӺ,*K+ E Des+LJV\U]oowz ])nfD@Pv*nu]YoI+_=E0=΢ aAisEJ `FRW"$J$EXEdFDfy5˯q2}ȫ:7|~9}пy?a}o~{}]=j`(nbL%ЊSA5Wro$L6 ,P&2%b\jbn(*Y-+}K&#=\˕3jഫÅ,BmEta_ !On[=$ /UXC]͗\޿<~m~䅶CY7ڊ>;)cf=JEʈَ4Vf{dT;?%k(^8g^-f[MhV |ꂩ>7S6,cDw!N?1F0#PnM}4u/= h}K شg9<mS 0 ο{2%dB;+&0o{ˆDn?-$(U˳k"1 T_]:R^V_^xcW_H Z&=TA!FPgN|#қj7yL`|2$&T=dv C(:HrȈI[7:$BzTpQsD F185zAgP&gϧ)G?Gܽ0:\k:QN=ADjЋpd:fo%c4ƃ/JM5ݘmAzcK]$9 ˝zAF0Fx,z(", Nˈ,57"H$ Bt?ЉH CYhdE-S4H„Ԍȵ }q/ JI+5BTĺ4.V۽OW\_C Tl+u0nܺyXo\_aHh %(w194EcX(E k:PtWKBo%}H24WBXA4^JK5ʋF()T`8IJJYI J hYyL({HlZg&eDf,$ P[Ϊ,GC:LY3J*CC7 џN@S4'4$X#Q{\P♲+A`'9Ϟ<"ȖUsh b#1ZtOH+S`EIcZ+'2xf*2J)C҇Fz*c-m@׺5f"1kkkkUWEmP֢JZx1nM*U6lWuӿeA\[ \1waxW&1BY䚺v\ZƬ]x^wiU>OˬU>Oi%mzՄP,CnAcN|TS*2US*AM(p=_x(*>'e㫩u,8K翰~?6>~cf#ga6ru#cq}kE8{_Gѣ8dgtufI.0Y4_~C'ŒPi7ZW1=٧$c?/ [ۧ{f}Nxc6HS)G?wX9O2NhO/TEXu?_}r<%>\;oi5#h;,-\L2/ ~p@l) }zc+@NCqS/</42dN69jc69jcXڨ$fZRp**e ոN#Pl A|MI丘kiw ܄&^uc  bYkU*an"h?IiܒOC֥ `s?OV,y V _|֋*o%mK. 
@^f5T-hnJoQa탛#KkN=BwS8݁JCT~BWfO߰kEﲇ~8;cއϣgw_>ـ]st=Kp!~\9>ϵ]>I" F>=޼̳ "-ccu!g{qΌ 1F7.cxoyZie̻y|!n̺$ [=ě^:zga]擱wz;?0ѱyo̓!˳N1=u[D_|38-m{i&BBIHmOhwB* MȓCI4+i5de&]2/II/QAIBt4RKFP[ŒLLtJHI q~;[ٲ 6i`{p>3QG p  rć馵:\9??醧 4!/T ܯO6h-: h2gHeB΍4%^ҷl]G69koZJ*ƄҤJBHtԁ]``ZXaɋ-۞ `i0{o~?+T.bɂ1vA1hHEiw/ڍ}8QggB *l !QÊ(fVl>XڱLչLoEWo )ꕞ6u0M7܏}<ߐlC影UFo&!}0WV"Pع}t?`*)6Mr@A瀊!"RIKG4sTu`õw~ٌ8qQh֔eX20/:*\lAtHWgD(8Dr8WfY-:w2MOCD!OC ox*.*oO%PH//LΒs`Uo_/q͵|s2ron~oX܎䩢 }’~H6Vi%0aìu\/gu4@d$ AX Ip.%TO d$IdK5QI- jQH 8uۆ"NBhoM}5_ ׆nK\Ǿ0i:.( `n6ɬliruJz( |u%ړɗokmk2MY_"E+ AZ e3d@ 'D@9+lZwEW!(MnI&\786}zst2ц䃁MKVеϗ(=f,U 4&K@E2A*Z6bR } r]UGoĢ)锛o\L1by6r MV T\>LvLg Z4`RzCoe8Wx;| :%c_L«KW+Aٓ_(00}$iQ3@S-WY=]ks۶+yez&|l7͜N;$ȒBJNL )YxxE "ϳbL5H)$$2aAlOӺȄ Pj F,WTE5 m7V2\ׄk¤ 65eXjs1!s)ꦘUw!&$˙ǰT͖VmB> jTqm31*_uchжLv0Ĕ|mn`{`Ɍ`WS)d,KG#ImE`8Dߞ~krL`,}D3]Y[KnqhjiYdY'k|8K7ΐ'FԵL-kEfbjo[Uk)\]q"WSTe B-.r[ M7JNEt fev@ E |S(ֹ۬MNV'XqpFwpc+J׾uh0Qfu`W"v1a!*uQ (tE 8O8?eQLb3ѽmnL*ӷέN2uƧ<a zIRtRnt& #1Γxǿ>2vc~7BKq`˃tه  02tzG8 5y `w kEL4x߿sLkL),3Wmo,KnvQ O@ANt}FG,u8sz}~NQbU17>?]7'+yrOo||s;6x;ww¶3N-Hd|$|w_CIρ<=x܂B_+%yOGn =ߤ^]__#⏛wqڗ:q&.q7];}L-?9{~|yO/gL66 (v\tGug*dho_d8:W6!(¥z%Q2|q^%EEh0ClE[TEa}%;~BQ*Bo:H4S:pgZo* 󼧙N7*E[#QU?wpӸN?G?;({l2>4f=UM:| * ??___ѣX듋9zsyq5ӋSsIGGI=&x;P]_cB znn73?SP ZnG\e -w& A2bzGǽcԾ2TuGq\+PW 78C _e@ħ)ч_A">@c~g_C#IRq8z5uп &JcndhA|,Auބ0мXU߻}5ʊ5?rI?0{h8 ^`T`ȋx6Q4%ڥ818gOrͼ<>E3SO1a01rqERĴLZL ̵՞ ڶ[pḏbSAW{=3/vHHKSdgCb>VOۋH5t./BDB0> w(:(zqv ҃~3F0͝3|k 5;(9 ޴J,pVE۾$p ,7̨_R^BJ͘ʮ==;&T@O~b֓"d8 ]Ywb6Y`fȚIzpx0id2q$pM!Qݚ2rBUZee*t ٠7U "rB8ٻR# –>ym ŽHaPh!j_9|+`.&z>?FˌXuT1LgP|8+g gX[xkL[ݓŹYIHP66Mw;mn"y]Q_ Sl>wfL֕D}vžLQ.EU{}7Î]ﮇ6{b$P/zM$JR e}` bF/8(Fh::ۅ'@bWwa*ZchÓ~PIEZoC%5${~~mN \/ߟ}Gg RJ\}BCosH_n|@T#RѤN?ƄT>38Bس?s}R-|[*@MUƇD x8 h pP݊>wCJ>0>LKk1/}xd&t枚E ,0ɸ)4΃m>Z h I:cuygfIڸ{?1S5OqܩNnՅq4}~{1Aik6K RL3V{I#x KEH;4B3%eReܳi#[,7I~> /EɻdY3X ir̅[!2?9-򻅏e a`vL>y1C\QNw/TzqKE]Ii[9b̧"ZYŁ+]1U>epLl]%3Oi=6iM_G`6Cx.ȃF)ޢLEJ}@{4Ej$y#GW+֒3AXr|-+[k1 Wp幊/+/b{ĸ,3բkĢ^ j}dC仉_w+Fw "Gg3)ݾz\z6"S׻rۛ‡RW^K!/sȋ{㴊uj[ ,Zvv9cUtI#|%UB4/5!{C:)}S2sλ! 
촔7eRg^`U޴32liԷE惖^Bll1Z Rsuek)bvZ[/02oR ;n6'ZjzH- F^Rb2Nh)%vZJ- l%k)VDJ kroRCDyek)dvZ icꄖBfԜa ;-Eh)SKˤ-Jv.[KVO 3'; c*OމK'2PA%W!QC!#BL(4( Oąϴ Y_S 6.l!9GEWVT` kPWi_Y7 DM8 f)ΦK@؉ʃ{`OR`\ְw3I$苅b]dX*E\B}iwONpk,8j$-ub.C OF "{`O~UQu-YU& 2JYހoF̠k?yQLFyNWFy{ݬ6Mبn9(sJ.u(Ӷ{t=2v:@ƫ׍V9Yl48lYAqr̪Uh4u~*^W@%ME%|)Uy_4ߦ2dƛotstfݭu|1SoNR۩vu^ܾ]v-!̦q/F woߞNb6[d':o>)!ɒ}ŲЕC2Iewyvo~CJR惲de3cJM]_ײ¤e5?x&VJeanf[T7˯B TdΚ#\A>!zy򓶅+o:6aa:5~ď|@mom'XP,dDW ɧr5 a#Yn.;ʸ)~V3oi_"!](.s;>S^T) 61V7h:۟nߦGwg)F IJq?ZFmrP{Q(|?>GA@`c p_K.<֤w;0P)er|a""@$$J chxD#E%C@bfR R qsE)%T/  "@T|< H@E %4 r ##3e*LOG@ĸ/ + X(RI( P+B8]|LMW3=^dq q)lβ"e`XLP^x@*JY X KvZn?c:wXqpS ׋DtzܯV&wt?bGx\qa3 QZa.7[ƿ~n*)^Y̴bY6^ݼzЫ}q+W}U7o?a)yxL\Yxw'GzrL&l^D,7~؛ݢ}x'jbE_kPԧmrkRKK5^O]֙7 ʽټУNaM>ݙ[4(~1qk\r҉E&W?s-ҫI8Rc0>Vx>'Lݙzn3.ͬb7?D^hQU uIu…/oxQgܸ"M6~ nڳAu${8F7ɶ#EvW֣ՒFlFb=ZWbYom>^?t;XƷݿaxw7eVt/׌09{5ϯ.e ~,v4X=vfQ׹$~7O M.3r˞(v=Oi]*|E~ax6_巓ɷ?A~ ?8=pr'w lՑec.^\e~(.q3w}yR=LjoBm->ϻFP1<-HrzB}?)JnmŸ\\7e2~V0vj.KW>8X ^DnHDD=BN (U6Q{@бbZ.Q)#,P_A^e |[Q,~߫0Ԁ :" р &JyjԯD%2%dYjg w"ε3e_vTVWK1қ6pV/oţ7޷u<^bqu[>59^.zHp F[rbyy%R[=MǾ\DDzIo`"|!7ngb_p.k¤&dbQpF9H< T !S wl R2.5;ͻ7]»aLlQ 0 x~zAeό?FhRij>ќ(QL>==DA\l":&5mtLTG$A2ы5ԯdllnԀݣ ]k5-_ށ_fɌga󰦡s0ќ}Uf$1Z+:p[UnAwnӻ|r&r/hI={&kN2ͤyEZzz޹epY?O7 3i&]ZM)j CY1ܹTFf NgɔPJɷ 󢱯~4/dͦ,(L\0ɰP'g iƔ'q:% L)00C$#ȕNd&T[A40?L끬 Ӗ)4x1/Q/ _W\.sQ*ϰJsF6s_-:ys5ZIcWwE/>5".1P,SiHŹ<[X)mCT.w3q;42g)Vf$i戥<ˌAq3Ma&s#j7}Zmoz{n%3BN[,)8:%?^N17gc%6\CD+GĥxT5xv'Յ@cQf5x>2щBY"H0$QČ I)F @Q`X*f Vc4 Ϟl%Hi2$VH2#3$ V(" O5&qV1 TǮc".>vK>65 Q Ll}2'#Sg%?lJ?j[=9Īfr_v^Yk0)# l-Dž1V~DCޯ / poO=l6|{WϩTZ?~*˅rC.k>pRoA89 Λ;a4K$:!>Q >{i_Y#Je >NkY7(,?rUBTP nhJ^QWLh$.K=pVT&H.$BM\?y5o%2'Ԑղ̲r1jy,IѩʆZdĈKxVcD&8I-JCc1,M ʬBV+Ro.&V$V!VЦZ4 T1B$ͰZGF֤X:1Y L[)Ea)bP%d^>  K+0vk 1G:@SG@F?e[T x8z?1о&)t-HoOb!I7sSOz(]h]XBi`^8Uta,jDmgWVieVgP q͎;6wP/XGcYK:VkX,0ch|vyO!{;ͧ?ݓVn 聱þ|cGXaLCa -F_VQ/ rMP{lٕDUOZai[`ΥЧՒ(G`zfbPV\t?"9SMu\3 ªqNݠV9@ ص js~\=4?靤ZrTz6^PB*;-VRzFJqZViK&!c 3K\ku|dJ4M4?Hp:5V^+~FʼiPQ|Kvtzs529='Եnfߏl;O.h 6Me i#X3_1(\lfF/Rǹ{T-_+M~d.>&w Yp8|K,TҐ/\EtT3ꤳ> nd 4(кb:Ϩc:2-mSM[hN)vv-&K1 [,!:֭Cnk"PYA֭ 
US,} !ik7s6=nm0 &J\sy) {kG{uމb)-mJ,|hN!㻰-]O=RQ"㤁.("esx'#Q?q_LVh`c*;&DXiUki>s*'s*qj}U91ơ@>{lj!b9ĎC99Yh"=x܂J l/p 4XH}୑o^懣$Ag@qe S6e JQ-KMOhƲҊrc@4X*@_#)򙘷>7㰘~N&i*wgVZ*nw|>Mfz/̯柬ߣR|[!{[F< 7/!z(ߦ*sXɊG&\<"V&:A[?W#e9&LaYLLfBze dCdbVǂۭVKI.q},7('x#^hynf1V~~9bMy!}tuom>5GL zxV1y-  >h%yaf`U$[K- qޒ2xxiJw^Ƞs_Jt 8&h-1w~FZJwv@ktC*YxvevNj:Z/~;d簌 10K-:@p,s\qyV4LW>Ch"]V\AԒ<⎪@JpW8ښW(\HՁZaD'lt֨O^GۡmSRZS< [qJ]ZJwk M99g#/ۖ Ï=Lښ lG]ՙ:;Z}of } \xbxc&jߑȆ 4X.ӇSxǷZ)ُjuy脔a>e쩗i~O ekhN1]nab:ϨcNwLIe-iukBCp)ܶnA>u;a2--mIK[hE ꦫ5?X/;kxԃCHSB]q7*||(綺AOx%XFIԛȪFwۅmO.Cj TѥCtbX$9)ҁ.0A,7pbhi}}!b#_^L>{g+}T~'Y,vX$sxBWJu yAfTʋvu!#IuȨDOA\ DGԔ7Y_Eb/$Y$XOͶHV$Yg$\ĞL46`[vX*b֤8AӲ-n/^f:4>$j9"\Ƥ_& Z hn T2e Ƌ$"SRs<Ǵb:G{7 J( "L$vQ}ޤZqOWG ~#бVs=@xF" 3z!royh43P sC7:>6wx]{+m@9;{@91ҝDLq/W=~&!HE1%\}Otabz`ݎ&fa40MFDu|%ֺ!o\E: r#b\yb[)X(XVrC,S$[EB}*X>e:cQṪ@vx8-Myp%⤥G\i)1MBKR1uң8T\PC8!.T1qc8N9R8-]\l1kb\;! b2d8dNi2Lp!X21W&/.xFF6PY:jdpNBHT$(H?sRRRRZf6j9CRVXX kXT#Cײm I(M ARyfr9tL2"< [.l#׷3vH9S+#T/9p^%#[/Ϳ@"7 6W6@kaSx"PhMB-!Tp&4䍫h'8zݺM~+bT;Xy_^Pܽ^2Һ!o\E;锘nLu[WĨNwnz2uN֭ y*ЩȂqnp z3Fu *]\@_ǣ} k= - ݊jM'-=r-$+}>:rqZZQmYKUU\պpZ&j'-=F-RIh.ϛTk餥GT}vS}ޤs<:n-EBvS}ޤZJ;ї8y|BҜqKs5MLQ<5<@KHVٱ9rW쿈j6ϗO5Nֽ]ŰGGőa aBd,̩˜ !ΝJd>ک [3)&4޶nt6uM<=L}d΂ɛxA&r& 5By)JV9 # VZ.5X*;Zkres Zh+kjV}F]V!ow]i9XqL ~Kښ?<_i?oҎi>\\g_  9{a]?ޒ܅Yܱ}~S@Zpiŵiz-fZru@S ux4[wI YbD?u` hЫ5k C}&A,WF!$IwgSfϳ.;AAvϢ0!7 5Lk֗SaJuW.g!]=WcC)/:-!nH%="0ԳNq\"B@ n,?%4l9XO>q2'ol~, ٿ\y?W/d|lw2z2 Z~ 3t J4}Ę%^Tj c,QnddK.`e㹒XXO[ҿ E&yn`RR~\5VTP\(^RfJ $LY)t;pU0kPᠶ2Y!q>1V|˧gbMǑT19;QM8g;W}&w};` I}fݻYyyw?_ 觇+oK6}E۳o!!Mi4ͮKArÄkR^nׯ~0Mx2FH?esA8JY=k'ﴻ{~_C 5Vj?ᜃ)?4ff{^y33_/؋ ϟ.;'6Kݼ B@h̻5Ņa e~%_׎PX۟0؛~NLk}Om+L)L%R&Q2Yd;jNlNlL2|sbtIBR<C'Bnli1, ņI5~'XRɁ3E'BAJ?ktϱn29r_F=0*x^ҺN|o$T< """.&!`"WxQL.2rTe%D q@\a)g>ܝ _gEkr]uEgp~ 3 (ۼZm^s\{i`[ 4d.ʫ5 L\I x+ Ogr8DIoQrlٖl%+3 &ͯ{? FB۽mIM"'Xt<;@-z*JB1*x5zhN}Hh#uW(IKm9;$zh%7hXFKYM5# Nrd xhG1-a#X*l<B'iѤ&AV{ _7]& f<*#~~$ѾWDsO_?}Y~\^dWz^cQ'OWӇtVqSȺ[k b,܇Ƞv5|?kx?n~[珿?ms+/Ię`c^e }#IAm<[aV2{J!oLZ]C J؛\.( >mU߯%}J*\Ql}-zҵV@,%J&y(mh7FTR. 
H\'Q#Mt{7%݋~~;=++{Prm8t90m =N9ƹڑp j(p(}g"kU`>EZWL 6.tQ*0#T[9.4s⛻*'ށ(io({}fo0x=QszN!ȕGThG=Sf/DKvF'7F-wl ؑȊowWaJ2x!q:uZ.X,j_~(ĕuv٠ǩO_ wkhw& N|:! Dhv 5xDno0#zjTFXϛ 885X8aӋZ̄@mĊ8*C[K7J^x}cV`Q{aIp;[|KH@&LJV2<m*k% 8OZ>!xZ6f^r_'1~ԂNv7sV&@e!cbՐ1LܳعZ)c)cF3,eb.*_0>U0K%d]8zZ Qo= O?[Gw5[]Z4vh"kj$w<#tQU" H*$Jpy;(N8?~x7-֧grh `Z@tfTDe%c6(ܑ9جD]"whv=dWyU&nۯr!:fzd)"%D.^ЛR#EٲH|n _ zI&ɔz n]KQ+)]-/iRbA˾*%ْɚl_5$#x1l#0;vy;iv{5b :7魛^N䌒)O~b|aB\$Bq{H9k1k@Φ*fbLN."M61h`;GZId3RdxQT!5BAS DOcIz4`X+R 60(-pPT!CtHj{XOLbObDgx.\4IȐqyFփHJH ӻa#0q2E{:e1/Vmvsv9Rd2eGE81-$_ =y欀N9:2s?7v "{zn.upN>Z$ֵ;RfN->0n4Ѕ Wz yd`t}7S;LD<'yYpNj2ݣy,Ns@;x= C)Y朰H"ɖBۍLvV|ٗPFá˫ɾV-Nb%̸NWj˾k!lS:#;D<`~u>+&1 xٗ5 u6-WX̂% ϾPm?~h@Z<0`D4/*D2'1` ", FUH)/ (>3vL(V c-%` ,٪XûMA)QMqqJ,j4* =ĺђy dd,#ykvhhI"0]gTNCٽ0rĸ*(N/X0E)HY Zr60~P_\wU5:1=MtdyȲ(#sYt )?~/6DhWtL<ęT|o&(ab#2t< ͒-KsHzIB(1ה N@5.O\Ҟ\ =7/e#D9*a8}IZCnxS"``Ǜnn'1pG#hӒ<o KId>q}IT#V() iQdoGxh)r>wwD_<-vîB%d%3Os,Z8p: .9㣧BgAZV\\S%V႞29٬ɸ N< ڍK5R#ny7ׂ@ŗj"9.%sa 2%DoFvVm(v`M!\.VM0 ^{Nհve^]÷=ʗCN N)}(>t={H cFO 9iڭg*;<f83"3С7"jZ[RZLPE&h0m}bD[V^ &PyiྔAnӹo+Z}s5!+af>tPkiXIen,Xnl`iq}պT|ȊY i8"EPո}4A#wy>c Y<;(%| 91.5殽fAp@cTpn# y 9z5Zwzy &gkkZީY&5(Ư  j`Ц NvB>,(PM gIii;¢pb5pA0Rh~&.X&r(0_JTJ+ z/NOTŸZEQ6E"9^<3P41^_]lC8 "},Eˆ ?)Xi1 MhVQw/l@T:Nf+/o{6ZϽhTm1؆gFLwٙto|z7ݪP1MEK $AkXRSh^(ۦUT[~-6Jݑ<>#zk?QPփ|]~N3R oz)3Gs]Redi,Ɲ"F*43>JD5qπʌZ:nLsJx(|U6R~ bj#z0mK>'>Љ&rE`CY(BiXVQ}DeA:ZJ~m]~dT@ǵm d4ɀ"ڣ֔0ݯ P`bO&0tKVEF'()YD%0F\)L`rǫoo&@cFŋ~TroqUxzcW레ā-mޕRD%C\NBH]GXr/-spml2`h*KAqʎ[]aRQ? +"ȇ@yrQ/YFC2v 裩 |o;p6 vJ$GgT Nzd>XJqKkkB+ЯlGMCJlt:\LH'6%)Gz9;C$vF*r͙sҥ*5>kI a]@ٛކK4lCgB-KeNu]Wh8h;!KPVİ,![{brޚEYC?"/@f\,G'N@[z()QV . ݙu%ޭ}P\~ΧK\),'4ZRƗ+g齪VM]4eOFgh[>@/к#lE@ 8Eȱ3FXhrWlWqihC_ReR\H]؛ڻ}ͻYq`@Q01@Oޟ N9J,Ge2_":]QR@DzAܑZ!zr`3)Cɹ㛓h)է%I)[#9EV 40<hzgи4(qnL5Ĝ:9tQn`%/:\}0J .xy2PJ O+}}o0/ύhA9I$>eN<I cgK]R{3Ѳ[7]&W%x _BZj1„$h1f\"a]g5_* - x)Q1M2? )p!/}@KǦw%R]vmrPxb[CUсBh9kh.S7BpKWߙF06mfn&3m-9(#:C%e))?~xͤ ISѧ#M[b( 25<#lhi9wN(%)FZGl76.!n݄ӗ|PZ՞'"A@O.RZI>*! 
0 /Vt`Cϯ]X.)8U F :릍Я?N5Yh˳ XeҘ9MPţ tǛڈZf% fiMp=WE;aBFo{^kC|R@Nb&A" L@;vzr.(ۆKO`$Y=hׯz,GhUU=;K0^% v~\"zd&"EoV^YK%M @)v+ A`H-@i£Wvk.EdƷIr 'P^y*2]k3>![vT(ѝ"ON{s/uBE2qzt՛N҆lkUlz?Ti`R7冭ݼwˍK3;tˁ[I+zUhQRD#=1sfjBy.`SFjr)E!ox#Ϩ \Na4z&ؤXCL>J<)ZLQK&J8vVg1 TyXxFK#**ZU6eUQc܃40,4~@p*[GAIΉBs|UٕoQ먴&])b IuK4X:)6* R)BKJ*)ˊ. yDm)#EyWdlj%$B,K9_2Yda}I |RJ7J#PǬ :4fmh%Em#M]qq>9t=§Apv]pߖm4ce>݋Kt/Ud]7o^_$闰g,Uo9znG(PNf(j|bUI ߩ*_MH)`x3FnH2kJ龰Ғpԛh*6WɅ4ϔKUG*惮ȻqВ%-ޱ!UC3X3)Sw;pVL{(gШWǽ`>=1Mqy{XxoDR>/TD=Ul]}TvO1֛Z1Kh[@YA%_`7 %+ `.)*~ՆڈFHVϮ-Eo?^M6_w],l?SXV(pZҵه^?~*{=~-Gaˍ.-ўV^msq4q*rRZ;;wG}˻{r.ޝӯH@VspsTn2V,dd]q kĻw)mM'߫J[ILPcznIC9w $g@nw%zgm?~k #Dט5IX'Zzڿ󷿾6%;ugd=Pߛ/fm-#ʽƫn]<{^9FK\^C jzuϲU 7w^"TbĚU5 JK; ABg,g^(iW/54_N-+F;Ek >MF8^=x&gN&VUG?i< }er`60 w8g"5A DGX!5 L"Э5w=g3rfd'WAg$)aO> 2ET|CMl4(.3WɱB2u ]7]YoI+^Eyv0דBeMQv7/eR*Veqq*6)ՀJ[!lk ܞw,"8^Kl ZQx͚(!~ARŧS挣"0!e x@ωO7 SE4D321w*'iSMDp:BʋR('eIQ ;9$39||JėŠ4+ 0RqGJVyf2I! Ҕt$DWVHI Aw]F:Ə,fmZJUeoBj-ƤR քJ醍{'b] ۂ&h> + b,V7žGHZ$AL]SĪ.d{Zwð[XVJNpFX B!8wɬ@ UxU3Ϲqn֤m8{5YyƓ F dAXk'Q@ܷ1S{4Bబ rSӑdBi W6;|lAf#FS^*g 8l-BV}h 0gl)N{iZl? 
)PJ%\˒* TRMJxF)(ϐ G=;ÎY{CcVՀhcRkrN%e0*h-N ցjJRRZ°jAUiP$ $`J墛FtH 5Rhꅠ{OFiAꨛ& n$9r̡O'p|S :Fõ cR C#J82lR`i)"0xp埿9;lN h!%` eYi]۰ga0SXtIӁ]gsΆcNVe>)6 CF%T6ESk|_#>~_v8}}ӭ]`-r֌.p}s>/#TW73_ꬋ~5oK nroh,r@k<濶J*Pu#uxѭ{d4FwWuQ@KN1Qr".x?^u]Ӂ)nAUD%bjˋˈ'ݤ+.퐻qVXc"Va`YS|o`)-n`:!jFn֎*3̱՛Ӝ=D.2I !=Gv'fIjZXphavI:_?:hXmɐXiXlD jHhXqnؖWePpj" ?6"WUqͬ&IԞe_3ɫGv>hPz:L :(ݧː$+},|..x>I·W .ˋp=J*g\y˕WϾ3ܦ#"ѧd25$YPčxu W9:ŇEU, u ;&uˀA,M%J +YaƂ."28`)JΏĪ!"3>.*k%lkyV+ d>ִvS[ۓO!2*FB|P˴tMR9Vb.b,8>ljvmr6 Q~JkL7Cp|;5>qՋгV/SU×dpM%/OY^}\=6-G*vG@٣4vi6UJE.Ф0 h&!)8YA%V  _IXImW,%7E P ׽6v J$h}ߪp+PeJ.HE7SA">*sf&_7)KݤiAh|ßCs)wȢhgMN6伙VPڡhVфV,RNcLVg[̲A6%^OEXfcnb uYH~zajåX7s󔕣]w6nD˚`]|Q[GNhrY}Yy3G:NW"ãI//\{0):6Mz^0Rr;ؼXth&atHjpDMkR#K6R2wi'V҃ -A9(-HA{j蓖\)i4@BS0m?o⿚0րx>]{_l->8E*ࢍcTeL1rz<6RSemQ{5φ7MX.xw].v7GW{_z"&¹U}Gc5SUsy5xvս#zt{&>![ ëz76Ùg*[ѫwUAmHּC~PP13X: NvMlwS24ֽܧ.ӗDzt1x ĉLĐkj@})~U, H-6hILIE@NBMLYutX/'7Ob""׋i/7 @՜?\^ad$Y۝1p4tU7UD.Mjd0zjMgǺSo.goxõh0̴5Jś4GZ6;cCmwpA8SLN\1&L(w7_pnl׏uֽX˖|J46TP>?jr>M ^DULFV5pzΊ`SKF [ו8C{~bIrś`Ƙ:yOʼ /YC,(ZJDr^?e$N?8F0 ?FѢkhZs'eۇN|]s6WT~ٻhhW!gn+r)%˖,Hl?#| pH{,#o繧A[zɋ8KWq)/͋Aۏ"Q|*ŸHŎEQ;X@( @VQΪlNIaK*%D*e*1;nZ@`K~.jm]Y27Pm% sD"P$ʓ۩>]:=:R)P*,dlL:^yk:}ǩNۉ=]&>tѶJHܶK%2[XU j)JV; kTҺ9L'1:seuu6'`T/zʙP Y}O\U2 *(jGpD׺aBln?__RԆw<uiIORѺBj qtԍ0bԚW|~Hl;(9@XJcox&{.\;D MSܤ`% 5&"酙a$OX-T]3;UP\"SXVE,u*tN M$]EЮo*{?%Gg =Apx-la@\pt$ -B]dD1 bd 1Z)ȈvoC'y%μ}!QNZ]>|ju("xbfV+vR(U u% ֥ ]=* \BP!&,uN)F=U}`ۛfI3# #$27Q}B537;s#Q< d s+*m (M_mm{g !0\:@[~}vewrnY"*]7aؾ!Mcߐ{ d-[hX3CY i,cb&ϻ"oBqH{M!(YwCjj2Ef~ !q)5efIu sB-YFy% )kJV+uuWe@6whI,+6;woyVÍzߤ%c%ܖdY Ar+d/Qm"&4bw$ \G7 z||q{Vt헋o3%! 1RCv<ă&AxBp]b9< b@{B╱y$jʯr@[=ßw8("q3l,;pb룒tuYӤ:h`}FNP䦝U$ݰ(nD5x@ݴšfm6ۘ.]%: pvѭz2ziI Jl4XI[# >\y~M !]ղzyss7a8 }P x2; Վd/Q n:Ʃv]NT'N5HqjٮoKv@;,eA$*˧ lw;TѪ_`=} hPM~^nZ#ZT@']ۈޗש[zVtu!8DS0eO\nJIn2N;X1wJvn"[!3dgD_S%4T$݋$0!!u+k=W̽goreQ4r~4k-U̯ w ;+,(hr. J&& eSrzLb:_ƨ3$g↯&1u9DǢT|5J^0]a{(p7#6:f1.q#nK܈~ x-Z{ƠcB0٧ ؈5CJRW`u:GZ o&#@eR5fOgW;_ܦR}XjA_}rߔ!;ų+о8n?D?Ak֠ EbfHUdde:ϊzuB(! 
Kn]Jⷛ/sC+PsQ a%¯\gq.ɮ<=QA, 7O˜,⑇)~PF6ړ6_n_W<(G25Ѫ<h]xcZ3 %'KԵ> WGSp7xѳˋz1n7||||VuV٪5JHlaQTc^pnrP Va [(,r#/M+竓E~u(3T^~>.?`B|^./D"e(?>ӈ1g\9ӈ!jg6%YpG?ͻն1OXۼ3@yH*(gZL"g@}'YV1rvUL{=ZS?56~@V@)!,XҖ)kJ+k{\ h̅"ZT&#0!ͅ4TH3Y&OE~+~+0W!IzDN#rEF˜kmDQ; yQ8RDhQi8LL(ԟ(}lxa*-Id_ =" m(!t+cGQ9xDSZІ= %4~{%%%%.$U Tlb]I䥳%k ,C5V$$6LPꋒ|g|'AS& 5u²ab 38cocb]bqխ䔌(i<_ȍ\8=26WU-@RɋJKΕ0$=|u%qFE$w*͂IXRفV,aY`inO\0?uwwSDnN$`m4`:@̢廐jAǛu1p:@'\wv`=4ån/z2JרՆ{x0ePvOvD54x.za1oC5aE]Ԥ{N[% A2oSp#85~$PYRD)Ϭ}try}x![b gIEx >_]?<]45%}|n\QIQH\普lmrQe/̿bA9 yx fL[g7a;gPv3# W'ڰȴ0TQ[4/7?0}4Ccy$ =cI9fpS~b+́LWJ+Y#CzմM'k*0`:u>t+$\4 |Lf7k"Զeio[VPZ$v'u6^CXAs@ޘWx)o'sm:"4pbp@)^L/'jE8Bsѡ"pF67u[ A[wCna[fA;w_ չ`ȋX]zv֛"@:QCINE@LAI%Τ\[ d6YCn.6-rviG:5,E¦:C~:SfBQ" ll ܑVma.u )[ˍ|E}6&q ΁X|.W26ǫ-t) Jd %%a3`mhv';>ڌ{CgUv􊽹ߌ7@gL{zDlTmP|l=;j:9^}cPb6p9M-ގ7]<2H nN1\kE8n䒬S \a! cT'Q { ԊORPPbZgD:W\hY-nv RC.1j-DNxc.A.V[SWfi_rы/xy=KZEeE3dΡI jէ;;BN(ӌDu=:u.]儅S Tbeg Y"b5t/|,ǔp]uދ/RJ1u_Fb`ӵcGį-gs0ެy> ,S/w֪k?ًICTUÅ}4LG1i\0i>n7s{eu댩J[Jjn;qTfg£9 !F|1wc|Ep= T q zUW?[!"w]E.X!N2{s}QV 7 3T{YNslj6~sMDpT< VϿn ٗǓ8DRnljeUt$pE.V!ȣqfOzrtOǞԨ_Nn >nl*Z_|RLGUJ,Z 2 z>hq:Z?ΈbpikqB@kۍ_>́?IGG\|L ΃C*H[N*9w<` t;ȢUVkW!i1F%^(ĸѻda:2e 08'"gD"YHMrl5?FxN W WѢr`7qY`sh_Z}p'"=4~`ϭ `Ot$rlǞw: 6(ijK(hmOkq,`["E g-՟pXY`۸D\B{cN؎1GOxeչf Uk,Bn)GKw7?0<3n~ G]Li}dmjH޻.FAŹT 1QȶokOLj <'|Nwgx>07gHr(D0(VmSU*u][Ss?7?jS "X[Zx[WxȐKgD/@ɑZ箖8-a EZ:1,PlU䶚1|P'=̗:T xe\.:H=KWGx,]hpq{L x0,]5OFVJsmB`G{|o'UU@" ^B'![-L3bl Vt5SF/}eoS~$`l!hX YL QFm0SUBRdOe4& 1:gJ|KY7}[% }[D"T'f/kϢh*_\sMV-=S/s,v}m3o݌Orh9ڎ|hd-)YsP `kv![dYdY##3w'l||^x-0&a]oR/1~ ?dG'`qu>vo|`-7Nd-4DspEp{Ir@޽W`:0]:X{Y\j$V% _\pJB,ukz 2^o7G\jY7yUxC֯lvb9^{eq_!v67QsŲzN%LY:.X?UjO`jmhMߚEgTHwsī"K֞G%}71U{ownQIW)'ǕIFr2]I5Ԕe5L왼Tj62DK8`Mu.|tVu~,GFzq̠r.ce &H G, b?`5jl(YȔ:e㨰1~>n<kZ 2ŒJPq=U!i\D|Q ',Wv? 
BNGZ, _k!5 /|WC}8b>=8q$(ɼdgjR܈ɩxˇ~"ٷدl2IUS0h<يxx S!2PNAU-L/BWd8 p} :{Q (bA$|V-/;'̲۷f;D\6- iMZt~?\V S}} ¨x՘1yh{)N[/rU9QRE/BlNq -Ch]b>S $" {fZ }K'M> MZVx }*ZK!R'b'',2nc2CIV4ɪZ珹x Xѩ' C㓏05sNɼzcԘ(,UPO~ k0@FOHG{ l)Ȳ>MO*hpW5< ?ޮ$ZT^"h)F!ե[Мo!ޭ?0~%cJ I.7"r Ǫ_:͋OZ yI0Hₐ[[з=E­,U/8Y~0X9_.@}T`\Ial&.'niNgs  Q=Ai1w߄ 3ATZ`CWVH^,[P5jW/~^n§ DžS^@|AX%k6Ѹ7M$%P*Z>Xӡ;RlNq^.hr"la ('dL-,k_jz-u2rp^˽pBx2 U{@)5hbKD@<ݏ8[/CZ~e3=#5IbyaE~rϯt] #J,",YrB".59_.4/@LC,N"lX2!f\Ka7t>I9]oǒW}C,asI/SE?d<;El4\OnFʁo nn MpƘE*2%VpF|mIC xp|K .4s*~ - !ʑ R;rF!Fy`ףZ|glmSX}q]_>RuW0/~[P{6_\-!V\HΣTIjZPц8P<ڊ E;*fu%8z2r饶ˉ~?T"N ΋EՍ1"=^_zT[ԒHIMҦVѥfvOr:c\V9VZda;y 0&:pt@A/GC\s? 6DŭF/6ܢ 瓁7@>!D\ds dgL-3yp奀un5C:\`΍(oJV܁ކilH8ش:<`m- \0&0gtր3XF/\ԗ\ͭ26mkh!02-K l4Vcv@&'VKv9V6/հ`XQuAb(HS\C6#~N"8v1DPWm#pMk:Z0J!0Ƣ/<С<`SIRT8n\&.SRU<|6_#,ՐmHuQ̭0[3Z ` }L̢)cGlJ7 a*49Tt┑ct-1 5?X#dtZҌmI‰8i֕BT<5v> l׺T1E&Œ7>[R>x&iٲ#-r8Eym#K*cTs,:^=Ցs4A5trG>;yt> 2c]_1ܲ95!BR VzՄ8o ^ t]?%2,C- ǠL5C8Ơ]~@k] yG0%8t9ʵ# \W9JK5 [ht#BPZc=sB9ݐ2"}/q󕈲,{>g.K(4E*&vQ3 h"NˋtZW;oPIGўD\.U opv7/h6b Z˲W7?^9{\-NSbbVٔEf3G1{2M=ގSA>ڹS2e]#Hٷ3sR3UR36@jH02PDEЩuF*NJ[lXl`+t6Kq ɭ.FϼVAHc$ !RU 7Єa^ч)V#ֱ R"~Q@ Uҝya|0צ({2y]y_ڸޛb1xZ-52L 򻄀>Fh_.RNz'u1ZwdkhR֔FZNyR)W2kOfDBLMyVS/TJSRT-k9#wZ֖[<7XKbo9Y Z@bR?3R,'UZT(Od-O>*[nfӸ[6Bfv"᮱v\wh<`&m܆gBɝ< 7qM!' L fU7Ch;ݪK }Z8VߔOBwyQ45+Gߦρ%C\M[.u[ niOhji;诡p]QT!Z)KFw."&KiMj"( JÄiF%Q+P6Lۯc |kL, .1 qf=g TpFb pGEGT`I, 6PjܖǪUPY-Ggg]LejfR ۙ+Xޟz{ o} 7[V}ц O ukHN% ̽&8gس\"u{dkP«jniU}*\ݺg`_y8fZ12~Q #-;J)^)g%$O?Ox;FeLˉg?`b 뙄l: ;tdVƪ:߾9׏=K93rEF:pkrt6E0-_-'~z)Gbf&?UoɭXfQ]/ϷчeHbt>ۼ?l)bJnBJJQ0Bs#e~ߕr򮷏ru^\遏,Oleey *n,|O*|=6yW)986uϣs?SGv2'0-dӺEi ZYX/gŷdCalJ|⍠e}JβhV ^DKeהH-88l ? 
}%rpYQhkiE 9#A)Z&%6uוjDGL3%^3PyeYh2,l86gOt`F-]_*BM%Y~'<-J5y}v}#M(U=yP~*m ""T} U4J1_fYqOi@@l?'L"{7.x|$(J˦kZLY خ#)<~\YZi+/yC]'\5HR3!e5kqT} tw#](  oufGJ~ʎ>8h_P^6^03RgkMGxj֩=![;!kzIɜT,7#Q;]yH*- Nt5$)o'rNy}2]hn oͶG>3>rM0T= flՑ.[hц̂鎏J1= Cgюd(r;b,YSkgL]}u?{3^& }ϑÌpen \Yq:}sq0*rTSLmm㽼ܾѹ=ǟBb.>]0cm|X}b*4pq!Y\GZDy7DH&“:, Ss݅> Qir Q9~㩇5G^k2i3qXβi1(N0KaִZ"ldWJy`mђ5>RWbV32S!|ΔALFDxq Qc4c,2#UZA)wG1Q&j-Qp'J}6!•LrρȵE~r}6mזׂZUj4:z[6jj).iׅ)y?anVwj{ae1"Bgjд^397@ióO{ֻD cUJujf0Ƚ20 z% |zWBS9@T,H#mg)>׏l cF[aK?`C h,kiPb<`XR7f& 4F-8 weDXiZ5BN0P`\C^^p$P%z[hFlA飭¼pU]?Q{q$Lfwy/'buF[gk]\7FGyUJ^ NfфJ/z|+;.}tA1V?0R2Axk)"{8V*q겞h=vN]8 Eeo1f,]N"JvQۻHrW zتAm?x ƒ!d2[BIb1dK"y%u)3E0  #x@teA3WGhe\ } /҅ݛD!5i'I,5 ͅA#-n\ii#SlY\JJUbf79ȓ7L6ő!OtJn|e&Dt8P@s9TV "\-7er"B)݇p@%'wU N x[4CЭiB[1Wqri7Fn%\wR[CMPY\e]VPBgxvj_axp,Xonqc=Sfz:2j U1*:׭(!b$kE3#&wxAΏ.#9~h7*}y 5ƑڐV1h+8FG WaM;'D( 1gԪM*LaV:dM59 K8vm&wW_;d71tk&sۚ>&/@YTUi[YVZۢ@4Ra("1+m` ZV1qRE+Ik PEUʛ`@`P ka(#1+I\Iʗe/K@ckeD+i8J!:r7 ^HTɺ[ H+[E+Vxư204{& 68Gq$8\!ƒcsL$ ,P)Cm\ԊK!8fY G+F0# ި?ά8RRGh=P4UqY^-v̩/ ~{dg {}GD1x0{!0Gf`QbY(K`ZRiYY0m8{9 Wk7Ww]`NRёw u$ u4U gT_''U7|0|̕.0(L^l.iɑkuN@JCsܤShENqݑj*9W.4& uY+GO٢1No[{Ք?,yL깽[Ԯy7m޹v.X榑WL8StXUҕTI`kk4!84*a%-gNy 0a }NK=hUbw֓;aT,%|.ro\Ni0MzQކ@TJY, Y&N!ښ3 d%Li"Pr#a+&Z実8%|^a4O~~ngt(38Ғ ǘ֖@T~!"|Ext'<'M8p}o^/uEZjg鑵? OyX )*&lB8pk+9UZH)xJxCr0R RJ=BIxnكpA J='@ZpAB穕7'#J=6D{+DPh  ګGGZ_7|3.3:C|3ZxVZ dlB?zBGD44PS^JF\rrl[8亼|G._U]}_֗~o`CNqMly}<NvO\)iC6^颺(.zӇ@ ,֮ LU,[=w40⃐=EJ%>>QKҔHKRHWj{{O~a3ju#7霭(;>Mq6z)* "o]]|$jovi(Ro`2C]l0_6v2cS hQ[/_=1GȥΚ'*! Z}rOs }{gV59NurvgwHr0V$(;9AO:Ʃ;x̑ͭ1\$8׎;/GÛ1h11r)rxys8)ߕu~C1=\|z4nOD}xhHӣݝvtztѵxXң q2S$L]Xb$ %x9Q#QJMT=ApqXp8&鸲7G뚒 0G{"~~QTz~<0ZYj6HZ\RRFQ(Q., i- K(;ۓZ<z#o>z}mq|píןOvC+}^ퟃ{Q܍۫R$0O̗l -#Ht^]G>? 
var/home/core/zuul-output/logs/kubelet.log:
Feb 21 21:46:24 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 21 21:46:24 crc restorecon[4685]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 21 21:46:24 crc restorecon[4685]:
/var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c176,c499 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 21 21:46:24 crc 
restorecon[4685]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 21:46:24 crc 
restorecon[4685]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 21:46:24 
crc restorecon[4685]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 21 21:46:24 crc 
restorecon[4685]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc 
restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc 
restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 21 21:46:24 crc 
restorecon[4685]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 21 21:46:24 crc 
restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 21 21:46:24 crc restorecon[4685]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:24 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc 
restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 21:46:25 crc restorecon[4685]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 21 21:46:25 crc restorecon[4685]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 21 21:46:25 crc kubenswrapper[4717]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 21 21:46:25 crc kubenswrapper[4717]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 21 21:46:25 crc kubenswrapper[4717]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 21 21:46:25 crc kubenswrapper[4717]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 21 21:46:25 crc kubenswrapper[4717]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 21 21:46:25 crc kubenswrapper[4717]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.687286 4717 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699391 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699465 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699476 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699487 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699497 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699508 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699518 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699528 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699537 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699545 
4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699554 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699561 4717 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699570 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699578 4717 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699585 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699593 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699601 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699609 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699617 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699625 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699632 4717 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699640 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699648 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699655 4717 feature_gate.go:330] unrecognized feature gate: Example Feb 21 
21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699663 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699671 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699690 4717 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699701 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699715 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699732 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699743 4717 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699755 4717 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699765 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699775 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699786 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699795 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699806 4717 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699816 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699827 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699839 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699850 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699898 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699912 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699921 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699929 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699937 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699947 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699957 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699967 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699977 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.699987 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 21 21:46:25 crc 
kubenswrapper[4717]: W0221 21:46:25.699998 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700008 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700018 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700028 4717 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700040 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700050 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700062 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700077 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700088 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700098 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700109 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700119 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700148 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700159 4717 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700169 4717 
feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700179 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700189 4717 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700199 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700210 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.700219 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701511 4717 flags.go:64] FLAG: --address="0.0.0.0" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701549 4717 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701571 4717 flags.go:64] FLAG: --anonymous-auth="true" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701586 4717 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701603 4717 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701615 4717 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701633 4717 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701647 4717 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701657 4717 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701667 4717 flags.go:64] FLAG: 
--boot-id-file="/proc/sys/kernel/random/boot_id" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701677 4717 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701690 4717 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701701 4717 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701710 4717 flags.go:64] FLAG: --cgroup-root="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701719 4717 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701728 4717 flags.go:64] FLAG: --client-ca-file="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701737 4717 flags.go:64] FLAG: --cloud-config="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701746 4717 flags.go:64] FLAG: --cloud-provider="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701755 4717 flags.go:64] FLAG: --cluster-dns="[]" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701769 4717 flags.go:64] FLAG: --cluster-domain="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701778 4717 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701787 4717 flags.go:64] FLAG: --config-dir="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701796 4717 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701806 4717 flags.go:64] FLAG: --container-log-max-files="5" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701819 4717 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701828 4717 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701838 4717 
flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701848 4717 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701896 4717 flags.go:64] FLAG: --contention-profiling="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701906 4717 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701915 4717 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701925 4717 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701934 4717 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701946 4717 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701955 4717 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701964 4717 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701973 4717 flags.go:64] FLAG: --enable-load-reader="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701982 4717 flags.go:64] FLAG: --enable-server="true" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.701991 4717 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702004 4717 flags.go:64] FLAG: --event-burst="100" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702013 4717 flags.go:64] FLAG: --event-qps="50" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702023 4717 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702033 4717 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 
21:46:25.702043 4717 flags.go:64] FLAG: --eviction-hard="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702056 4717 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702065 4717 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702076 4717 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702088 4717 flags.go:64] FLAG: --eviction-soft="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702097 4717 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702107 4717 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702116 4717 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702127 4717 flags.go:64] FLAG: --experimental-mounter-path="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702138 4717 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702150 4717 flags.go:64] FLAG: --fail-swap-on="true" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702162 4717 flags.go:64] FLAG: --feature-gates="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702177 4717 flags.go:64] FLAG: --file-check-frequency="20s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702189 4717 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702201 4717 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702213 4717 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702229 4717 flags.go:64] FLAG: --healthz-port="10248" Feb 21 21:46:25 crc kubenswrapper[4717]: 
I0221 21:46:25.702241 4717 flags.go:64] FLAG: --help="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702252 4717 flags.go:64] FLAG: --hostname-override="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702264 4717 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702275 4717 flags.go:64] FLAG: --http-check-frequency="20s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702285 4717 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702295 4717 flags.go:64] FLAG: --image-credential-provider-config="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702304 4717 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702317 4717 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702326 4717 flags.go:64] FLAG: --image-service-endpoint="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702335 4717 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702345 4717 flags.go:64] FLAG: --kube-api-burst="100" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702354 4717 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702364 4717 flags.go:64] FLAG: --kube-api-qps="50" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702373 4717 flags.go:64] FLAG: --kube-reserved="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702382 4717 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702391 4717 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702400 4717 flags.go:64] FLAG: --kubelet-cgroups="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 
21:46:25.702409 4717 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702419 4717 flags.go:64] FLAG: --lock-file="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702428 4717 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702438 4717 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702447 4717 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702462 4717 flags.go:64] FLAG: --log-json-split-stream="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702477 4717 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702487 4717 flags.go:64] FLAG: --log-text-split-stream="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702496 4717 flags.go:64] FLAG: --logging-format="text" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702506 4717 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702516 4717 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702525 4717 flags.go:64] FLAG: --manifest-url="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702534 4717 flags.go:64] FLAG: --manifest-url-header="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702547 4717 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702557 4717 flags.go:64] FLAG: --max-open-files="1000000" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702582 4717 flags.go:64] FLAG: --max-pods="110" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702591 4717 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 
21:46:25.702601 4717 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702610 4717 flags.go:64] FLAG: --memory-manager-policy="None" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702620 4717 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702629 4717 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702638 4717 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702648 4717 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702672 4717 flags.go:64] FLAG: --node-status-max-images="50" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702682 4717 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702691 4717 flags.go:64] FLAG: --oom-score-adj="-999" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702700 4717 flags.go:64] FLAG: --pod-cidr="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702709 4717 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702723 4717 flags.go:64] FLAG: --pod-manifest-path="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702732 4717 flags.go:64] FLAG: --pod-max-pids="-1" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702741 4717 flags.go:64] FLAG: --pods-per-core="0" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702750 4717 flags.go:64] FLAG: --port="10250" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702759 4717 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702768 4717 flags.go:64] FLAG: --provider-id="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702778 4717 flags.go:64] FLAG: --qos-reserved="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702786 4717 flags.go:64] FLAG: --read-only-port="10255" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702795 4717 flags.go:64] FLAG: --register-node="true" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702805 4717 flags.go:64] FLAG: --register-schedulable="true" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702813 4717 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702830 4717 flags.go:64] FLAG: --registry-burst="10" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702840 4717 flags.go:64] FLAG: --registry-qps="5" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702849 4717 flags.go:64] FLAG: --reserved-cpus="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702889 4717 flags.go:64] FLAG: --reserved-memory="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702903 4717 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702914 4717 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702924 4717 flags.go:64] FLAG: --rotate-certificates="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702933 4717 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702942 4717 flags.go:64] FLAG: --runonce="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702950 4717 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702960 4717 flags.go:64] FLAG: 
--runtime-request-timeout="2m0s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702969 4717 flags.go:64] FLAG: --seccomp-default="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702978 4717 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702988 4717 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.702997 4717 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703007 4717 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703017 4717 flags.go:64] FLAG: --storage-driver-password="root" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703026 4717 flags.go:64] FLAG: --storage-driver-secure="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703035 4717 flags.go:64] FLAG: --storage-driver-table="stats" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703044 4717 flags.go:64] FLAG: --storage-driver-user="root" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703053 4717 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703063 4717 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703072 4717 flags.go:64] FLAG: --system-cgroups="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703081 4717 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703100 4717 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703109 4717 flags.go:64] FLAG: --tls-cert-file="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703119 4717 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 
21:46:25.703131 4717 flags.go:64] FLAG: --tls-min-version="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703140 4717 flags.go:64] FLAG: --tls-private-key-file="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703149 4717 flags.go:64] FLAG: --topology-manager-policy="none" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703158 4717 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703167 4717 flags.go:64] FLAG: --topology-manager-scope="container" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703178 4717 flags.go:64] FLAG: --v="2" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703190 4717 flags.go:64] FLAG: --version="false" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703202 4717 flags.go:64] FLAG: --vmodule="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703217 4717 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.703230 4717 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703478 4717 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703490 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703504 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703513 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703522 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703531 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703540 4717 
feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703550 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703558 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703567 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703577 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703585 4717 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703593 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703602 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703610 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703618 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703627 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703635 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703643 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703653 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703663 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703674 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703682 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703691 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703700 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703708 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703716 4717 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703725 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703734 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703742 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703750 4717 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703758 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703766 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703774 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 21 21:46:25 crc kubenswrapper[4717]: 
W0221 21:46:25.703784 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703792 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703800 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703808 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703817 4717 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703826 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703833 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703841 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703849 4717 feature_gate.go:330] unrecognized feature gate: Example Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703886 4717 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703894 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703905 4717 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703915 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703924 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703933 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703942 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703951 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703959 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703969 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703977 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703986 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.703994 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.704002 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.704011 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.704020 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.704030 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 
21:46:25.704038 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.704048 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.704059 4717 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.704073 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.704084 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.704096 4717 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.704108 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.704120 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.704132 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.704142 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.704153 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.704170 4717 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false 
UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.719270 4717 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.719345 4717 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719504 4717 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719522 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719531 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719541 4717 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719550 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719559 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719568 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719576 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719584 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719593 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719600 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719609 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 21 21:46:25 crc 
kubenswrapper[4717]: W0221 21:46:25.719616 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719624 4717 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719632 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719640 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719650 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719665 4717 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719674 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719682 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719692 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719701 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719710 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719719 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719727 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719735 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 21 21:46:25 crc 
kubenswrapper[4717]: W0221 21:46:25.719743 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719751 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719759 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719767 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719775 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719783 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719790 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719798 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719810 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719819 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719826 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719834 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719845 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719855 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719899 4717 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719914 4717 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719926 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719934 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719942 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719950 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719959 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719967 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719976 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719984 4717 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.719991 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720000 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720008 4717 feature_gate.go:330] unrecognized feature gate: Example Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720016 4717 feature_gate.go:330] 
unrecognized feature gate: VSphereMultiVCenters Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720024 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720031 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720039 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720047 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720055 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720062 4717 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720070 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720077 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720085 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720092 4717 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720100 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720108 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720119 4717 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720129 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720139 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720148 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720171 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.720186 4717 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720503 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720515 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720525 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720533 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720542 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720551 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 21 21:46:25 crc 
kubenswrapper[4717]: W0221 21:46:25.720559 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720568 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720576 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720584 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720591 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720599 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720607 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720615 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720622 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720630 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720639 4717 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720647 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720656 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720664 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720674 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720684 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720694 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720702 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720711 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720719 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720730 4717 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720739 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720750 4717 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720758 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720766 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720774 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720781 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720789 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720810 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720818 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720827 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720834 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720842 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720850 4717 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720883 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720892 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720899 4717 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720907 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720932 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720940 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720950 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720960 4717 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720971 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720980 4717 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720988 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.720996 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721003 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721011 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721019 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721026 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721034 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721042 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721052 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721063 4717 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721072 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721080 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721089 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721097 4717 feature_gate.go:330] unrecognized feature gate: Example
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721106 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721117 4717 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721125 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721133 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721141 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721149 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.721170 4717 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.721184 4717 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.721561 4717 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.728309 4717 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.728461 4717 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.730361 4717 server.go:997] "Starting client certificate rotation"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.730414 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.731692 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-01 11:07:14.020346737 +0000 UTC
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.731811 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.761169 4717 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 21 21:46:25 crc kubenswrapper[4717]: E0221 21:46:25.765115 4717 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.765773 4717 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.786491 4717 log.go:25] "Validated CRI v1 runtime API"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.830256 4717 log.go:25] "Validated CRI v1 image API"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.832927 4717 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.838495 4717 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-21-21-41-48-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.838552 4717 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.868589 4717 manager.go:217] Machine: {Timestamp:2026-02-21 21:46:25.864933666 +0000 UTC m=+0.646467328 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec BootID:18974447-0c58-4cc9-b717-e1c8f74e7687 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:bd:71:f7 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:bd:71:f7 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:1e:f7:0e Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b3:7b:4e Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:44:37:3a Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:f0:53:80 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:7a:25:89:aa:09:0b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:66:c3:46:90:8a:40 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.869039 4717 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.869259 4717 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.872452 4717 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.872773 4717 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.872842 4717 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.874006 4717 topology_manager.go:138] "Creating topology manager with none policy"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.874039 4717 container_manager_linux.go:303] "Creating device plugin manager"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.874676 4717 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.874712 4717 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.874994 4717 state_mem.go:36] "Initialized new in-memory state store"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.875137 4717 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.881316 4717 kubelet.go:418] "Attempting to sync node with API server"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.881449 4717 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.881504 4717 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.881540 4717 kubelet.go:324] "Adding apiserver pod source"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.881570 4717 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.886532 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.886532 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused
Feb 21 21:46:25 crc kubenswrapper[4717]: E0221 21:46:25.886743 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError"
Feb 21 21:46:25 crc kubenswrapper[4717]: E0221 21:46:25.886770 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.887247 4717 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.888527 4717 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.892334 4717 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.894018 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.894107 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.894129 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.894148 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.894178 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.894200 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.894217 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.894246 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.894265 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.894282 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.894306 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.894326 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.895438 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.896210 4717 server.go:1280] "Started kubelet"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.897578 4717 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.897602 4717 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.898558 4717 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 21 21:46:25 crc systemd[1]: Started Kubernetes Kubelet.
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.899359 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.908093 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.908172 4717 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.908476 4717 server.go:460] "Adding debug handlers to kubelet server"
Feb 21 21:46:25 crc kubenswrapper[4717]: E0221 21:46:25.909046 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.909064 4717 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.909124 4717 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.909101 4717 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.909623 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 12:19:01.724105905 +0000 UTC
Feb 21 21:46:25 crc kubenswrapper[4717]: E0221 21:46:25.913103 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="200ms"
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.913365 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused
Feb 21 21:46:25 crc kubenswrapper[4717]: E0221 21:46:25.913468 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError"
Feb 21 21:46:25 crc kubenswrapper[4717]: E0221 21:46:25.913922 4717 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.65:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189661358435fe02 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-21 21:46:25.89616077 +0000 UTC m=+0.677694422,LastTimestamp:2026-02-21 21:46:25.89616077 +0000 UTC m=+0.677694422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.917554 4717 factory.go:55] Registering systemd factory
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.917614 4717 factory.go:221] Registration of the systemd container factory successfully
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.922389 4717 factory.go:153] Registering CRI-O factory
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.922450 4717 factory.go:221] Registration of the crio container factory successfully
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.922614 4717 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.922673 4717 factory.go:103] Registering Raw factory
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.922707 4717 manager.go:1196] Started watching for new ooms in manager
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.923938 4717 manager.go:319] Starting recovery of all containers
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931031 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931139 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931174 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931202 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931230 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931260 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931289 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931317 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931360 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931393 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931427 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931455 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931487 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931521 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931550 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931581 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931613 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931648 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931678 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931708 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931734 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931763 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931791 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931820 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931851 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931916 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931949 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config"
seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.931980 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932011 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932045 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932073 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932105 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932134 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 
21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932162 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932201 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932234 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932260 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932290 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932318 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932350 4717 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932383 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932411 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932445 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932475 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932504 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932537 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932571 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.932596 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933476 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933539 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933566 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933590 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933625 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933701 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933726 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933751 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933775 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933800 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933824 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933843 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933904 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933926 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933952 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933972 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.933992 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.934014 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.937130 4717 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938058 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938081 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938103 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938119 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938133 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938149 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938162 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938182 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938197 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938212 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938225 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938240 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938254 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938268 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938282 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938297 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938309 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938322 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938336 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938348 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938361 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938376 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938390 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938403 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938420 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938849 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938918 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938939 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938958 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938976 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.938993 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939008 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939026 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939043 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939065 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939084 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939101 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939123 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939148 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939168 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939188 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939205 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939220 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939236 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939251 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939268 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939283 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939297 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939311 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939323 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939337 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" 
seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939353 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939366 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939382 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939395 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939409 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939422 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939437 
4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939453 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939468 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939483 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939498 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939512 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939526 4717 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939540 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939554 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939568 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939581 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939594 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939614 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939632 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939652 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939674 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939691 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939706 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939719 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939734 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939749 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939764 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939776 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939790 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939803 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939815 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939830 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939842 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939880 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939894 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939907 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939932 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939945 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939959 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939971 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939984 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.939999 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" 
seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940043 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940061 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940076 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940097 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940114 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940131 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940146 4717 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940166 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940190 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940209 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940227 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940249 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940265 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940280 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940296 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940316 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940334 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940352 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940370 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940389 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940408 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940431 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940452 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940491 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940508 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940529 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940551 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940569 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940588 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940606 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940623 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940640 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940659 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940680 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940701 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940720 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940739 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940757 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940780 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940802 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940823 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940846 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940899 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940919 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940939 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940961 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.940978 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.941028 4717 reconstruct.go:97] "Volume reconstruction finished" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.941045 4717 reconciler.go:26] "Reconciler: start to sync state" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.948461 4717 manager.go:324] Recovery completed Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.970654 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.972974 4717 kubelet_network_linux.go:50] 
"Initialized iptables rules." protocol="IPv4"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.974114 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.974182 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.974207 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.974949 4717 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.975025 4717 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.975065 4717 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 21 21:46:25 crc kubenswrapper[4717]: E0221 21:46:25.975139 4717 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 21 21:46:25 crc kubenswrapper[4717]: W0221 21:46:25.976275 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused
Feb 21 21:46:25 crc kubenswrapper[4717]: E0221 21:46:25.976405 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.976514 4717 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.976538 4717 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 21 21:46:25 crc kubenswrapper[4717]: I0221 21:46:25.976569 4717 state_mem.go:36] "Initialized new in-memory state store"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.004117 4717 policy_none.go:49] "None policy: Start"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.005752 4717 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.005832 4717 state_mem.go:35] "Initializing new in-memory state store"
Feb 21 21:46:26 crc kubenswrapper[4717]: E0221 21:46:26.010015 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.068378 4717 manager.go:334] "Starting Device Plugin manager"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.068496 4717 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.068529 4717 server.go:79] "Starting device plugin registration server"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.069517 4717 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.069570 4717 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.070502 4717 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.070755 4717 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.070780 4717 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.075744 4717 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.075978 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.078297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.078379 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.078405 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.078735 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.079484 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.079572 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:26 crc kubenswrapper[4717]: E0221 21:46:26.080315 4717 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.080414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.080453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.080477 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.080668 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.080896 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.080981 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.081586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.081642 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.081660 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.084386 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.084428 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.084443 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.085041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.085102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.085129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.085328 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.085577 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.085695 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.086642 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.086687 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.086701 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.086919 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.087152 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.087235 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.087781 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.087820 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.087835 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.087929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.087956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.087974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.088161 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.088210 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.091050 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.091088 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.091101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.091142 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.091183 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.091212 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:26 crc kubenswrapper[4717]: E0221 21:46:26.115178 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="400ms"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.143375 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.143438 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.143482 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.143585 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.143651 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.143771 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.143847 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.143955 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.144069 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.144216 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.144284 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.144320 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.144403 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.144529 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.144621 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.170470 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.172317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.172395 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.172412 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.172473 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: E0221 21:46:26.173573 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246099 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246221 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246280 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246325 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246370 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246388 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246463 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246465 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246491 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246480 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246419 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246645 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246671 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246423 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246730 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246695 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246773 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246780 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246819 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246827 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246786 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246795 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246934 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246969 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.246984 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.247034 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.247001 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.247080 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.247116 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.247222 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.374155 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.375602 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.375635 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.375647 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.375675 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: E0221 21:46:26.376114 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.438138 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.459481 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.486835 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.495912 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.498292 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: W0221 21:46:26.505665 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-ac99d49ad6923c6d3c29bf9ee1dfac7640e6d4a9643e935fc10233b7d24e3039 WatchSource:0}: Error finding container ac99d49ad6923c6d3c29bf9ee1dfac7640e6d4a9643e935fc10233b7d24e3039: Status 404 returned error can't find the container with id ac99d49ad6923c6d3c29bf9ee1dfac7640e6d4a9643e935fc10233b7d24e3039
Feb 21 21:46:26 crc kubenswrapper[4717]: W0221 21:46:26.506985 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-1bd43183790fb3802fc153d0f33ed2e5513b46c0b392750beae34418526b5dc9 WatchSource:0}: Error finding container 1bd43183790fb3802fc153d0f33ed2e5513b46c0b392750beae34418526b5dc9: Status 404 returned error can't find the container with id 1bd43183790fb3802fc153d0f33ed2e5513b46c0b392750beae34418526b5dc9
Feb 21 21:46:26 crc kubenswrapper[4717]: W0221 21:46:26.516718 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-b87b2abf929c2f36676f7d011c6b1df8e34ff2392725d3a58ecfdca4eaabb428 WatchSource:0}: Error finding container b87b2abf929c2f36676f7d011c6b1df8e34ff2392725d3a58ecfdca4eaabb428: Status 404 returned error can't find the container with id b87b2abf929c2f36676f7d011c6b1df8e34ff2392725d3a58ecfdca4eaabb428
Feb 21 21:46:26 crc kubenswrapper[4717]: E0221 21:46:26.517430 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="800ms"
Feb 21 21:46:26 crc kubenswrapper[4717]: W0221 21:46:26.527179 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e3a22b35f20a230c9890d240d600b39bfa60f5881b3fb26724d038197792eb14 WatchSource:0}: Error finding container e3a22b35f20a230c9890d240d600b39bfa60f5881b3fb26724d038197792eb14: Status 404 returned error can't find the container with id e3a22b35f20a230c9890d240d600b39bfa60f5881b3fb26724d038197792eb14
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.776546 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.779658 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.780295 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.780325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.780378 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: E0221 21:46:26.781402 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc"
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.900368 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.910394 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 12:46:54.673440465 +0000 UTC
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.979399 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1bd43183790fb3802fc153d0f33ed2e5513b46c0b392750beae34418526b5dc9"}
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.980683 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ac99d49ad6923c6d3c29bf9ee1dfac7640e6d4a9643e935fc10233b7d24e3039"}
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.981790 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c74231577911a97be644721090c128fd0ab17e725807936149b57f3444e6f6f9"}
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.982932 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e3a22b35f20a230c9890d240d600b39bfa60f5881b3fb26724d038197792eb14"}
Feb 21 21:46:26 crc kubenswrapper[4717]: I0221 21:46:26.984025 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b87b2abf929c2f36676f7d011c6b1df8e34ff2392725d3a58ecfdca4eaabb428"}
Feb 21 21:46:27 crc kubenswrapper[4717]: W0221 21:46:27.216652 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused
Feb 21 21:46:27 crc kubenswrapper[4717]: E0221 21:46:27.216773 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError"
Feb 21 21:46:27 crc kubenswrapper[4717]: W0221 21:46:27.225577 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused
Feb 21 21:46:27 crc kubenswrapper[4717]: E0221 21:46:27.225670 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError"
Feb 21 21:46:27 crc kubenswrapper[4717]: W0221 21:46:27.267530 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused
Feb 21 21:46:27 crc kubenswrapper[4717]: E0221 21:46:27.267646 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect:
connection refused" logger="UnhandledError" Feb 21 21:46:27 crc kubenswrapper[4717]: E0221 21:46:27.318445 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="1.6s" Feb 21 21:46:27 crc kubenswrapper[4717]: W0221 21:46:27.512313 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Feb 21 21:46:27 crc kubenswrapper[4717]: E0221 21:46:27.512443 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.582505 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.585361 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.585469 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.585490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.585569 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 21 21:46:27 crc kubenswrapper[4717]: E0221 21:46:27.586670 4717 kubelet_node_status.go:99] 
"Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.896680 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 21 21:46:27 crc kubenswrapper[4717]: E0221 21:46:27.899696 4717 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.900402 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.910765 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 15:52:31.249271387 +0000 UTC Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.989224 4717 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e" exitCode=0 Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.989425 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.989561 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e"} Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.990598 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.990666 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.990688 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.991268 4717 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="185793e7e1f802a2b3276fcf193733a50ae99245b3c9dcfab11256e857b70eb0" exitCode=0 Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.991318 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"185793e7e1f802a2b3276fcf193733a50ae99245b3c9dcfab11256e857b70eb0"} Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.991489 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.993642 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.993680 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.993693 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.994770 4717 
generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234" exitCode=0 Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.994816 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234"} Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.994959 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.996310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.996361 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.996383 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.998648 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.998850 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf"} Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.998942 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff"} Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.998969 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd"} Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.999505 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.999537 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:27 crc kubenswrapper[4717]: I0221 21:46:27.999549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:28 crc kubenswrapper[4717]: I0221 21:46:28.001217 4717 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f730b67d5e86f2248b208a0531cc5838750b086d4aa81eafb38791af7b683d62" exitCode=0 Feb 21 21:46:28 crc kubenswrapper[4717]: I0221 21:46:28.001254 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f730b67d5e86f2248b208a0531cc5838750b086d4aa81eafb38791af7b683d62"} Feb 21 21:46:28 crc kubenswrapper[4717]: I0221 21:46:28.001385 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:28 crc kubenswrapper[4717]: I0221 21:46:28.002227 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:28 crc kubenswrapper[4717]: I0221 21:46:28.002255 4717 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:28 crc kubenswrapper[4717]: I0221 21:46:28.002267 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:28 crc kubenswrapper[4717]: I0221 21:46:28.900668 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Feb 21 21:46:28 crc kubenswrapper[4717]: E0221 21:46:28.908360 4717 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.65:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189661358435fe02 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-21 21:46:25.89616077 +0000 UTC m=+0.677694422,LastTimestamp:2026-02-21 21:46:25.89616077 +0000 UTC m=+0.677694422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 21 21:46:28 crc kubenswrapper[4717]: I0221 21:46:28.911704 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 05:30:03.095419267 +0000 UTC Feb 21 21:46:28 crc kubenswrapper[4717]: E0221 21:46:28.919617 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="3.2s" Feb 21 21:46:29 crc 
kubenswrapper[4717]: I0221 21:46:29.007895 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"04a04b3b7590932005e60eaf68fa97fdeb071b48df00bcdbb0c65deb8fe9da9d"} Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.007952 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"57c2fa325d2da94225841ab78a9e160e52bace1f0ec2d121b93db85323b55785"} Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.007966 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6bfed6617bca25f3b3212d9fb2d1ae31778a863f9fa9b1fe3e9eb787dea44f09"} Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.008077 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.009212 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.009243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.009252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.013842 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"955da883505373fb420d68cfa68a9cbb458ab8b9d419a5948b02ff33fe41b14e"} Feb 21 21:46:29 crc 
kubenswrapper[4717]: I0221 21:46:29.013964 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.015765 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.015808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.015839 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.020212 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf"} Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.020240 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71"} Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.020252 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0"} Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.024724 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131"} Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 
21:46:29.024783 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.025886 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.025916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.025927 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.028529 4717 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7040d4b6e66eec274468a6da614a01d836f977669de37f0735d39b7fb6ad5b47" exitCode=0 Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.028570 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7040d4b6e66eec274468a6da614a01d836f977669de37f0735d39b7fb6ad5b47"} Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.028615 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.029300 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.029446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.029458 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.186835 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.188558 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.188610 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.188629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.188671 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 21 21:46:29 crc kubenswrapper[4717]: E0221 21:46:29.189494 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Feb 21 21:46:29 crc kubenswrapper[4717]: W0221 21:46:29.255519 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Feb 21 21:46:29 crc kubenswrapper[4717]: E0221 21:46:29.255650 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.436134 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 21:46:29 crc kubenswrapper[4717]: I0221 21:46:29.913186 4717 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 07:43:48.141373575 +0000 UTC Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.037761 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935"} Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.037840 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc"} Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.037981 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.039711 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.039772 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.039796 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.041695 4717 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e459081a802e9d60a5ee2f1b48cbd144b283561477c74f718bf6dfd680d5b56c" exitCode=0 Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.041852 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.041896 4717 prober_manager.go:312] "Failed to trigger a manual 
run" probe="Readiness" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.041965 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.041960 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e459081a802e9d60a5ee2f1b48cbd144b283561477c74f718bf6dfd680d5b56c"} Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.042092 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.041960 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.043373 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.043427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.043460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.043657 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.043714 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.043744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.043933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.043990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.044010 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.045729 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.045751 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.045764 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:30 crc kubenswrapper[4717]: I0221 21:46:30.914120 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 20:15:32.902659134 +0000 UTC Feb 21 21:46:31 crc kubenswrapper[4717]: I0221 21:46:31.051505 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"86446202e0d26ca2c2c3973d810ef949382d1981b62e3319b17fc4ff201378e7"} Feb 21 21:46:31 crc kubenswrapper[4717]: I0221 21:46:31.051563 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9703a4c04285be2b1a9b9a1b681bd354bc2266389c4a64aa7e5bbcfb2d8906f8"} Feb 21 21:46:31 crc kubenswrapper[4717]: I0221 21:46:31.051587 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"625bcd2589c9c533c58a35ce056454884d93dccfc75cbc494e6063b8d8fefb39"}
Feb 21 21:46:31 crc kubenswrapper[4717]: I0221 21:46:31.051637 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 21 21:46:31 crc kubenswrapper[4717]: I0221 21:46:31.051707 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:31 crc kubenswrapper[4717]: I0221 21:46:31.051721 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:31 crc kubenswrapper[4717]: I0221 21:46:31.053461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:31 crc kubenswrapper[4717]: I0221 21:46:31.053522 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:31 crc kubenswrapper[4717]: I0221 21:46:31.053541 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:31 crc kubenswrapper[4717]: I0221 21:46:31.053569 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:31 crc kubenswrapper[4717]: I0221 21:46:31.053623 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:31 crc kubenswrapper[4717]: I0221 21:46:31.053663 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:31 crc kubenswrapper[4717]: I0221 21:46:31.321802 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:31 crc kubenswrapper[4717]: I0221 21:46:31.915123 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 21:09:38.259195829 +0000 UTC
Feb 21 21:46:31 crc kubenswrapper[4717]: I0221 21:46:31.917311 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.062473 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.062523 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.062545 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.062459 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fdc6cf6818d83e685016e62a94546c4e7b5e405af261ee0269db9dc1be501e4c"}
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.062750 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6bd653efc0f8ae7e2a2244598a51c694ea9f209e3419c9ad8ba135fbab546647"}
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.064176 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.064222 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.064235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.064445 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.064519 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.064550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.390048 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.391915 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.391988 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.392007 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.392047 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.436675 4717 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.436774 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.858431 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:32 crc kubenswrapper[4717]: I0221 21:46:32.915526 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 11:35:24.168375988 +0000 UTC
Feb 21 21:46:33 crc kubenswrapper[4717]: I0221 21:46:33.065268 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 21 21:46:33 crc kubenswrapper[4717]: I0221 21:46:33.065326 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:33 crc kubenswrapper[4717]: I0221 21:46:33.065435 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:33 crc kubenswrapper[4717]: I0221 21:46:33.066479 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:33 crc kubenswrapper[4717]: I0221 21:46:33.066541 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:33 crc kubenswrapper[4717]: I0221 21:46:33.066561 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:33 crc kubenswrapper[4717]: I0221 21:46:33.068692 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:33 crc kubenswrapper[4717]: I0221 21:46:33.068769 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:33 crc kubenswrapper[4717]: I0221 21:46:33.068796 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:33 crc kubenswrapper[4717]: I0221 21:46:33.891782 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 21 21:46:33 crc kubenswrapper[4717]: I0221 21:46:33.915992 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 03:42:55.202164766 +0000 UTC
Feb 21 21:46:34 crc kubenswrapper[4717]: I0221 21:46:34.070410 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:34 crc kubenswrapper[4717]: I0221 21:46:34.072032 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:34 crc kubenswrapper[4717]: I0221 21:46:34.072115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:34 crc kubenswrapper[4717]: I0221 21:46:34.072137 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:34 crc kubenswrapper[4717]: I0221 21:46:34.916461 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 04:56:31.715341364 +0000 UTC
Feb 21 21:46:35 crc kubenswrapper[4717]: I0221 21:46:35.168516 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 21:46:35 crc kubenswrapper[4717]: I0221 21:46:35.168822 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:35 crc kubenswrapper[4717]: I0221 21:46:35.170612 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:35 crc kubenswrapper[4717]: I0221 21:46:35.170681 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:35 crc kubenswrapper[4717]: I0221 21:46:35.170704 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:35 crc kubenswrapper[4717]: I0221 21:46:35.193069 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 21:46:35 crc kubenswrapper[4717]: I0221 21:46:35.815230 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Feb 21 21:46:35 crc kubenswrapper[4717]: I0221 21:46:35.815565 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:35 crc kubenswrapper[4717]: I0221 21:46:35.817183 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:35 crc kubenswrapper[4717]: I0221 21:46:35.817265 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:35 crc kubenswrapper[4717]: I0221 21:46:35.817285 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:35 crc kubenswrapper[4717]: I0221 21:46:35.917345 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 05:36:30.893113298 +0000 UTC
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.078137 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.079767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.079820 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.079843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:36 crc kubenswrapper[4717]: E0221 21:46:36.080486 4717 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.125475 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.125761 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.127475 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.127587 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.127632 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.247344 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.255459 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.303650 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.303932 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.305727 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.305803 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.305826 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:36 crc kubenswrapper[4717]: I0221 21:46:36.918473 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 00:45:57.504238212 +0000 UTC
Feb 21 21:46:37 crc kubenswrapper[4717]: I0221 21:46:37.081375 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:37 crc kubenswrapper[4717]: I0221 21:46:37.083172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:37 crc kubenswrapper[4717]: I0221 21:46:37.083269 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:37 crc kubenswrapper[4717]: I0221 21:46:37.083290 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:37 crc kubenswrapper[4717]: I0221 21:46:37.088465 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 21 21:46:37 crc kubenswrapper[4717]: I0221 21:46:37.919132 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 02:46:41.759962379 +0000 UTC
Feb 21 21:46:38 crc kubenswrapper[4717]: I0221 21:46:38.084744 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:38 crc kubenswrapper[4717]: I0221 21:46:38.086427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:38 crc kubenswrapper[4717]: I0221 21:46:38.086501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:38 crc kubenswrapper[4717]: I0221 21:46:38.086522 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:38 crc kubenswrapper[4717]: I0221 21:46:38.920133 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 20:25:48.241979639 +0000 UTC
Feb 21 21:46:39 crc kubenswrapper[4717]: I0221 21:46:39.087852 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:39 crc kubenswrapper[4717]: I0221 21:46:39.089286 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:39 crc kubenswrapper[4717]: I0221 21:46:39.089350 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:39 crc kubenswrapper[4717]: I0221 21:46:39.089369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:39 crc kubenswrapper[4717]: I0221 21:46:39.161328 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Feb 21 21:46:39 crc kubenswrapper[4717]: I0221 21:46:39.161448 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Feb 21 21:46:39 crc kubenswrapper[4717]: W0221 21:46:39.475269 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 21 21:46:39 crc kubenswrapper[4717]: I0221 21:46:39.475582 4717 trace.go:236] Trace[2042685056]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Feb-2026 21:46:29.473) (total time: 10001ms):
Feb 21 21:46:39 crc kubenswrapper[4717]: Trace[2042685056]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (21:46:39.475)
Feb 21 21:46:39 crc kubenswrapper[4717]: Trace[2042685056]: [10.001514544s] [10.001514544s] END
Feb 21 21:46:39 crc kubenswrapper[4717]: E0221 21:46:39.475737 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 21 21:46:39 crc kubenswrapper[4717]: W0221 21:46:39.751981 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 21 21:46:39 crc kubenswrapper[4717]: I0221 21:46:39.752129 4717 trace.go:236] Trace[1421062739]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Feb-2026 21:46:29.750) (total time: 10002ms):
Feb 21 21:46:39 crc kubenswrapper[4717]: Trace[1421062739]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (21:46:39.751)
Feb 21 21:46:39 crc kubenswrapper[4717]: Trace[1421062739]: [10.002020069s] [10.002020069s] END
Feb 21 21:46:39 crc kubenswrapper[4717]: E0221 21:46:39.752165 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 21 21:46:39 crc kubenswrapper[4717]: W0221 21:46:39.762080 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 21 21:46:39 crc kubenswrapper[4717]: I0221 21:46:39.762303 4717 trace.go:236] Trace[166042441]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Feb-2026 21:46:29.760) (total time: 10002ms):
Feb 21 21:46:39 crc kubenswrapper[4717]: Trace[166042441]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (21:46:39.762)
Feb 21 21:46:39 crc kubenswrapper[4717]: Trace[166042441]: [10.002025918s] [10.002025918s] END
Feb 21 21:46:39 crc kubenswrapper[4717]: E0221 21:46:39.762354 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 21 21:46:39 crc kubenswrapper[4717]: I0221 21:46:39.901160 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Feb 21 21:46:39 crc kubenswrapper[4717]: I0221 21:46:39.920962 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 13:38:26.106100282 +0000 UTC
Feb 21 21:46:40 crc kubenswrapper[4717]: I0221 21:46:40.669016 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 21 21:46:40 crc kubenswrapper[4717]: I0221 21:46:40.669096 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 21 21:46:40 crc kubenswrapper[4717]: I0221 21:46:40.674276 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 21 21:46:40 crc kubenswrapper[4717]: I0221 21:46:40.674351 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 21 21:46:40 crc kubenswrapper[4717]: I0221 21:46:40.922007 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 17:11:21.849551179 +0000 UTC
Feb 21 21:46:41 crc kubenswrapper[4717]: I0221 21:46:41.923010 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 07:04:18.523190808 +0000 UTC
Feb 21 21:46:42 crc kubenswrapper[4717]: I0221 21:46:42.437057 4717 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 21 21:46:42 crc kubenswrapper[4717]: I0221 21:46:42.437175 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 21 21:46:42 crc kubenswrapper[4717]: I0221 21:46:42.868854 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:42 crc kubenswrapper[4717]: I0221 21:46:42.869743 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:42 crc kubenswrapper[4717]: I0221 21:46:42.871591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:42 crc kubenswrapper[4717]: I0221 21:46:42.871655 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:42 crc kubenswrapper[4717]: I0221 21:46:42.871676 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:42 crc kubenswrapper[4717]: I0221 21:46:42.876905 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:46:42 crc kubenswrapper[4717]: I0221 21:46:42.923138 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 11:43:34.522402789 +0000 UTC
Feb 21 21:46:43 crc kubenswrapper[4717]: I0221 21:46:43.101705 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:43 crc kubenswrapper[4717]: I0221 21:46:43.103069 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:43 crc kubenswrapper[4717]: I0221 21:46:43.103123 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:43 crc kubenswrapper[4717]: I0221 21:46:43.103141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:43 crc kubenswrapper[4717]: I0221 21:46:43.924036 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 10:51:48.130942916 +0000 UTC
Feb 21 21:46:43 crc kubenswrapper[4717]: I0221 21:46:43.932071 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Feb 21 21:46:43 crc kubenswrapper[4717]: I0221 21:46:43.932389 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:43 crc kubenswrapper[4717]: I0221 21:46:43.935255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:43 crc kubenswrapper[4717]: I0221 21:46:43.935329 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:43 crc kubenswrapper[4717]: I0221 21:46:43.935351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:43 crc kubenswrapper[4717]: I0221 21:46:43.955562 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.105307 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.106834 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.106973 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.107001 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.381683 4717 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.593690 4717 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.615666 4717 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.909035 4717 apiserver.go:52] "Watching apiserver"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.917715 4717 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.918183 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"]
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.918689 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.918919 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.918964 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 21 21:46:44 crc kubenswrapper[4717]: E0221 21:46:44.919076 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 21 21:46:44 crc kubenswrapper[4717]: E0221 21:46:44.919117 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.919349 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.919419 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.919531 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 21 21:46:44 crc kubenswrapper[4717]: E0221 21:46:44.919838 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.921847 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.922015 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.923373 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.923701 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.924204 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.924233 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.924261 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.924248 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 08:59:50.635541774 +0000 UTC
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.924275 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.928137 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.969034 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.987934 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 21 21:46:44 crc kubenswrapper[4717]: I0221 21:46:44.998757 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.007546 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.011370 4717 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.020438 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.029548 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.042428 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.056665 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:45 crc kubenswrapper[4717]: E0221 21:46:45.680083 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 21 21:46:45 crc kubenswrapper[4717]: E0221 21:46:45.680208 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.681489 4717 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.681633 4717 trace.go:236] Trace[922153096]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Feb-2026 21:46:35.088) (total time: 10592ms): Feb 21 21:46:45 crc kubenswrapper[4717]: Trace[922153096]: ---"Objects listed" error: 10592ms (21:46:45.681) Feb 21 
21:46:45 crc kubenswrapper[4717]: Trace[922153096]: [10.592594158s] [10.592594158s] END Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.681674 4717 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.683757 4717 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.727765 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51464->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.727849 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51464->192.168.126.11:17697: read: connection reset by peer" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.728279 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.728326 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection 
refused" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782093 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782162 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782191 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782222 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782247 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782273 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782300 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782323 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782355 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782377 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782401 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782425 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782447 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782470 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782495 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782484 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782522 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782552 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782577 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782603 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782627 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782650 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782672 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782693 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782718 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782743 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782787 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 21 
21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782811 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782832 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782879 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782903 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782925 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782950 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782975 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.782999 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783019 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783044 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783070 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" 
(UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783097 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783119 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783144 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783174 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783196 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783223 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783248 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783275 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783300 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783338 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783364 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783392 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783400 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783419 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783446 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783473 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783466 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783498 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783500 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783609 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783655 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783710 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783756 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783795 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783833 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783904 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783941 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783978 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784013 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784050 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784092 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 21 21:46:45 crc 
kubenswrapper[4717]: I0221 21:46:45.784131 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784167 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784204 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784259 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784296 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784333 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784367 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784402 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784437 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784477 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784513 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784549 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784589 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784630 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784669 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784711 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784750 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784791 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784830 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784890 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784938 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784974 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785011 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785052 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785087 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785125 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785164 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785199 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 21 21:46:45 crc 
kubenswrapper[4717]: I0221 21:46:45.785235 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785271 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785307 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785346 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785384 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785423 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785548 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785585 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785664 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785701 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785743 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 21 21:46:45 crc kubenswrapper[4717]: 
I0221 21:46:45.785780 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785815 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785851 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785915 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785952 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785992 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786033 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786070 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786106 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786141 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786180 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 21 21:46:45 
crc kubenswrapper[4717]: I0221 21:46:45.786221 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786261 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786299 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786336 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786379 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786415 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786454 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786490 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786526 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786565 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786605 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 21:46:45 crc 
kubenswrapper[4717]: I0221 21:46:45.786642 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786679 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786716 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786751 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786788 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786828 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786983 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787025 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787066 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787113 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787152 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 21 21:46:45 crc 
kubenswrapper[4717]: I0221 21:46:45.787294 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787334 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787375 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787414 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787455 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787502 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787543 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787584 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787625 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787662 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787701 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787740 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787777 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787815 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787852 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787922 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787959 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787997 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788037 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788078 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788116 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788155 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 21 21:46:45 crc kubenswrapper[4717]: 
I0221 21:46:45.788197 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788235 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788280 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788321 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788364 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788403 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788441 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788484 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788522 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788560 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788603 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 21 21:46:45 crc kubenswrapper[4717]: 
I0221 21:46:45.788643 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788683 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788724 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788780 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788825 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788901 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788944 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788984 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789027 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789072 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789112 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 21 21:46:45 crc 
kubenswrapper[4717]: I0221 21:46:45.789156 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789195 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789235 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789274 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789314 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789351 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789390 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789428 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789490 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789531 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789571 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 21 21:46:45 crc 
kubenswrapper[4717]: I0221 21:46:45.789613 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789652 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789689 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789730 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789807 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789887 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789931 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789981 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790026 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790069 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:46:45 crc kubenswrapper[4717]: 
I0221 21:46:45.790110 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790156 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790197 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790266 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790312 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790354 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790401 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790445 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790687 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790717 4717 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 
21:46:45.790745 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790773 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790797 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.792717 4717 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.796143 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.797559 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783900 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783909 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: E0221 21:46:45.811825 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 21:46:45 crc kubenswrapper[4717]: E0221 21:46:45.812319 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:46:46.312216914 +0000 UTC m=+21.093750546 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783953 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.783980 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784234 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784273 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784606 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784622 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784666 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784957 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.784987 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.813128 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785168 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785192 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785247 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785331 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785649 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785665 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785703 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785756 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.785847 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786136 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786215 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.786400 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.787252 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788191 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788233 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788551 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788570 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788838 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.788919 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789088 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789150 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.789792 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790406 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790628 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.791160 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.790853 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.791924 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.791952 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.793070 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.793408 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.793916 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.794831 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.795464 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.795513 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.796305 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.796453 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.796467 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.796926 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.797067 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.797670 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.797994 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.798080 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.796995 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.798303 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.798574 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.798509 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.798680 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.798702 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.799023 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.799068 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.798607 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.799098 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.798738 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: E0221 21:46:45.810503 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 21 21:46:45 crc kubenswrapper[4717]: E0221 21:46:45.813998 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:46:46.313976478 +0000 UTC m=+21.095510090 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.810847 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.810927 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.811090 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.811427 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.811496 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.811584 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.811623 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.811671 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.812826 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.812954 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.813004 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.813131 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.813734 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.814039 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.814349 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.814596 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.815092 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.815109 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.815787 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.816005 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.816124 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.816599 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.816645 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.816886 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.817142 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.817695 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.817817 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.818226 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.818482 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.818880 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.819333 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.819457 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.819639 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.821646 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.821691 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.822448 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.822665 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.823239 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.823446 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.823458 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.824218 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.824374 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 21 21:46:45 crc kubenswrapper[4717]: E0221 21:46:45.824558 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 21 21:46:45 crc kubenswrapper[4717]: E0221 21:46:45.824598 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 21 21:46:45 crc kubenswrapper[4717]: E0221 21:46:45.824622 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 21 21:46:45 crc kubenswrapper[4717]: E0221 21:46:45.824707 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-21 21:46:46.324680069 +0000 UTC m=+21.106213731 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.825441 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.826166 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.827412 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.829022 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 21 21:46:45 crc kubenswrapper[4717]: E0221 21:46:45.830240 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:46:46.330209169 +0000 UTC m=+21.111742831 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.832909 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: E0221 21:46:45.833174 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 21 21:46:45 crc kubenswrapper[4717]: E0221 21:46:45.833196 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 21 21:46:45 crc kubenswrapper[4717]: E0221 21:46:45.833212 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 21 21:46:45 crc kubenswrapper[4717]: E0221 21:46:45.833287 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-21 21:46:46.333263276 +0000 UTC m=+21.114796918 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.833761 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.834727 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.836625 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.837099 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.837646 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.837826 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.837938 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.838067 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.838124 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.838184 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.838042 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.841316 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.841401 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.844636 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.845058 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.845627 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.845651 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.845985 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.846183 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.846394 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.846495 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.846574 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.846606 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.846914 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.846929 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.847153 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.847358 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.847738 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.847769 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.848047 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.849814 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.850490 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.851294 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.851364 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.851417 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.851699 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.851892 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.852299 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.854411 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.855011 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.855448 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.855555 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.855934 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.856233 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.856260 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.856472 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.859752 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.860015 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.860292 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.860306 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.860632 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.860899 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.861141 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.861184 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.860933 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.860969 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.861376 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.861451 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.861578 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.861928 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.863435 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.867021 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.867643 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.867643 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.868032 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.868199 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.869472 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.869563 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.869801 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.870342 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.870439 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.870878 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.873446 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.880010 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.881201 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.881631 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.882483 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.884502 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.887667 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: W0221 21:46:45.887650 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-9f13bc3bad4df5a116326cc4ab0793454f81c84b35e9ee2db31a10acc9c37cc7 WatchSource:0}: Error finding container 9f13bc3bad4df5a116326cc4ab0793454f81c84b35e9ee2db31a10acc9c37cc7: Status 404 returned error can't find the container with id 9f13bc3bad4df5a116326cc4ab0793454f81c84b35e9ee2db31a10acc9c37cc7 Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.895960 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.896702 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.899459 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.899612 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.901127 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.913787 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914037 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914058 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914069 4717 reconciler_common.go:293] "Volume detached for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914080 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914088 4717 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914099 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914111 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914121 4717 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914131 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914141 4717 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914213 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914306 4717 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914349 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914367 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914431 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914447 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914470 4717 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914484 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914498 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914513 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914526 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914543 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914558 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914572 4717 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914585 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914599 4717 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914612 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914625 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914639 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914652 4717 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914666 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914679 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914692 4717 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914705 4717 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914719 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914733 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914746 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914759 4717 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914772 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914784 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914799 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914812 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914824 4717 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914837 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914853 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 
crc kubenswrapper[4717]: I0221 21:46:45.914894 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914910 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914929 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914947 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914964 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.914981 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915009 4717 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915022 4717 reconciler_common.go:293] "Volume detached for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915035 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915048 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915062 4717 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915074 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915087 4717 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915102 4717 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915115 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915128 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915140 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915152 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915166 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915179 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915193 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915205 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 
21:46:45.915219 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915231 4717 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915244 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915257 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.915270 4717 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.916991 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917035 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917049 4717 reconciler_common.go:293] "Volume detached for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917064 4717 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917078 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917090 4717 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917102 4717 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917115 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917128 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917139 4717 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917151 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917165 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917177 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917188 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917200 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917210 4717 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917222 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node 
\"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917234 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917261 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917274 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917289 4717 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917304 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917322 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917337 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917349 4717 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917360 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917396 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917409 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917421 4717 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917434 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917446 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917457 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917468 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917479 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917491 4717 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917502 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917513 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917535 4717 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917546 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917558 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917569 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917581 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917593 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917604 4717 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917617 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917630 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath 
\"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917646 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917663 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917675 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917689 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917703 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917714 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917725 4717 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 
21:46:45.917737 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917748 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917759 4717 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917770 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917781 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917796 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917814 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917830 4717 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917844 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917878 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917891 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917909 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917925 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917937 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917949 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") 
on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917960 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917971 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.917995 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918007 4717 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918019 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918031 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918046 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918062 4717 reconciler_common.go:293] 
"Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918077 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918090 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918102 4717 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918114 4717 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918126 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918139 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918153 4717 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918169 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918185 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918196 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918208 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918631 4717 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918646 4717 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918656 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") 
on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918667 4717 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918703 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918713 4717 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918724 4717 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918736 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918748 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918762 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918774 4717 
reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918787 4717 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918798 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918809 4717 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918820 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918831 4717 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918845 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918855 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918888 4717 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918900 4717 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918913 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918924 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918936 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918947 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918957 4717 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918968 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918979 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.918990 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.919000 4717 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.928438 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 10:26:16.925857288 +0000 UTC Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.928548 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.939278 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.985888 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.986234 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.986499 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.988814 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.990619 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.991878 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.992376 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.993485 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.994127 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.995254 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.996097 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.998133 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 21 21:46:45 crc kubenswrapper[4717]: I0221 21:46:45.999698 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.001349 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.002169 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.002647 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.003148 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.005756 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 21 21:46:46 crc 
kubenswrapper[4717]: I0221 21:46:46.006877 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.008038 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.008934 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.009618 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.010747 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.011381 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.011904 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.013537 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 21 21:46:46 crc 
kubenswrapper[4717]: I0221 21:46:46.014272 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.015252 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.015387 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.016312 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.017534 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.018126 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" 
path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.019017 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.019455 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.019591 4717 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.019704 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.019801 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.021832 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.022414 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.022818 4717 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.024513 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.025327 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.025570 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.028259 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.029834 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.032602 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.033707 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.035989 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.037008 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.037496 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.039570 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.040603 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.042691 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.043915 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.046460 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.047659 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.048560 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.049599 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.050731 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.052147 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.054289 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.055759 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.115299 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.120103 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc" exitCode=255 Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.120209 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc"} Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.123252 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca"} Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.123316 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9f13bc3bad4df5a116326cc4ab0793454f81c84b35e9ee2db31a10acc9c37cc7"} Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.134541 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.135334 4717 scope.go:117] "RemoveContainer" containerID="77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.137179 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.149088 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.157299 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.170773 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.184080 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.186205 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:46 crc kubenswrapper[4717]: W0221 21:46:46.200725 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-7cc555902db3d3319da5d8d6ba5888f958d691755a7e3b937408dc364017185b WatchSource:0}: Error finding container 7cc555902db3d3319da5d8d6ba5888f958d691755a7e3b937408dc364017185b: Status 404 returned error can't find the container with id 7cc555902db3d3319da5d8d6ba5888f958d691755a7e3b937408dc364017185b Feb 21 21:46:46 crc kubenswrapper[4717]: W0221 21:46:46.202110 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-b19fd66f845c9e3bdf6e8f9c4a9a91292fdeb943b56ea9cd47c7514707c21243 WatchSource:0}: Error finding container b19fd66f845c9e3bdf6e8f9c4a9a91292fdeb943b56ea9cd47c7514707c21243: Status 404 
returned error can't find the container with id b19fd66f845c9e3bdf6e8f9c4a9a91292fdeb943b56ea9cd47c7514707c21243 Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.212211 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.235096 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.322107 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.322150 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.322216 4717 
configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.322258 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:46:47.322246481 +0000 UTC m=+22.103780103 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.322357 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.322515 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:46:47.322479647 +0000 UTC m=+22.104013439 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.423424 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.423501 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.423529 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.423606 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-21 21:46:47.423583155 +0000 UTC m=+22.205116777 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.423637 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.423654 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.423667 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.423722 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-21 21:46:47.423708858 +0000 UTC m=+22.205242480 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.423804 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.423839 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.423854 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.423951 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-21 21:46:47.423922673 +0000 UTC m=+22.205456295 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.929002 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 15:13:15.80771229 +0000 UTC Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.975928 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.976102 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.976209 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:46:46 crc kubenswrapper[4717]: I0221 21:46:46.976327 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.976473 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:46:46 crc kubenswrapper[4717]: E0221 21:46:46.976681 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.127372 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367"} Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.128628 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b19fd66f845c9e3bdf6e8f9c4a9a91292fdeb943b56ea9cd47c7514707c21243"} Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.129957 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d"} Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.130014 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7cc555902db3d3319da5d8d6ba5888f958d691755a7e3b937408dc364017185b"} Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.131375 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.133108 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147"} Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.133458 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.154444 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.172054 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.195584 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.218703 4717 csr.go:261] certificate signing request csr-cfbpr is approved, waiting to be issued Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.222307 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.241118 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.253912 4717 csr.go:257] certificate signing request csr-cfbpr is issued Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.274049 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.309165 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.332275 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.332352 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:47 crc kubenswrapper[4717]: E0221 21:46:47.332450 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 21:46:47 crc kubenswrapper[4717]: E0221 21:46:47.332503 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 21:46:47 crc kubenswrapper[4717]: E0221 21:46:47.332562 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:46:49.332541001 +0000 UTC m=+24.114074623 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 21:46:47 crc kubenswrapper[4717]: E0221 21:46:47.332583 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-21 21:46:49.332574872 +0000 UTC m=+24.114108494 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.342486 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.346500 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-dg4jx"] Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.346830 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-dg4jx" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.356129 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 21 21:46:47 crc kubenswrapper[4717]: W0221 21:46:47.356939 4717 reflector.go:561] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": failed to list *v1.Secret: secrets "node-resolver-dockercfg-kz9s7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Feb 21 21:46:47 crc kubenswrapper[4717]: E0221 21:46:47.356974 4717 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"node-resolver-dockercfg-kz9s7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-resolver-dockercfg-kz9s7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.357038 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.375707 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.413935 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.433152 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.433261 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0abf9538-30e5-4e8e-8084-ecf9eee7e364-hosts-file\") pod \"node-resolver-dg4jx\" (UID: \"0abf9538-30e5-4e8e-8084-ecf9eee7e364\") " pod="openshift-dns/node-resolver-dg4jx" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.433305 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-v9c55\" (UniqueName: \"kubernetes.io/projected/0abf9538-30e5-4e8e-8084-ecf9eee7e364-kube-api-access-v9c55\") pod \"node-resolver-dg4jx\" (UID: \"0abf9538-30e5-4e8e-8084-ecf9eee7e364\") " pod="openshift-dns/node-resolver-dg4jx" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.433340 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.433364 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:46:47 crc kubenswrapper[4717]: E0221 21:46:47.433529 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 21:46:47 crc kubenswrapper[4717]: E0221 21:46:47.433547 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 21:46:47 crc kubenswrapper[4717]: E0221 21:46:47.433563 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 21:46:47 crc kubenswrapper[4717]: E0221 21:46:47.433573 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 21:46:47 crc kubenswrapper[4717]: E0221 21:46:47.433576 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:47 crc kubenswrapper[4717]: E0221 21:46:47.433589 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:47 crc kubenswrapper[4717]: E0221 21:46:47.433646 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-21 21:46:49.433620387 +0000 UTC m=+24.215154009 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:47 crc kubenswrapper[4717]: E0221 21:46:47.433668 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-02-21 21:46:49.433660438 +0000 UTC m=+24.215194060 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:47 crc kubenswrapper[4717]: E0221 21:46:47.434001 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:46:49.433991136 +0000 UTC m=+24.215524758 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.436553 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.493423 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.515054 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.528826 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.534683 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0abf9538-30e5-4e8e-8084-ecf9eee7e364-hosts-file\") pod \"node-resolver-dg4jx\" (UID: \"0abf9538-30e5-4e8e-8084-ecf9eee7e364\") " pod="openshift-dns/node-resolver-dg4jx" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.534728 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9c55\" (UniqueName: \"kubernetes.io/projected/0abf9538-30e5-4e8e-8084-ecf9eee7e364-kube-api-access-v9c55\") pod \"node-resolver-dg4jx\" (UID: \"0abf9538-30e5-4e8e-8084-ecf9eee7e364\") " pod="openshift-dns/node-resolver-dg4jx" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.534973 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/0abf9538-30e5-4e8e-8084-ecf9eee7e364-hosts-file\") pod \"node-resolver-dg4jx\" (UID: \"0abf9538-30e5-4e8e-8084-ecf9eee7e364\") " pod="openshift-dns/node-resolver-dg4jx" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.543165 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.553836 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9c55\" (UniqueName: \"kubernetes.io/projected/0abf9538-30e5-4e8e-8084-ecf9eee7e364-kube-api-access-v9c55\") pod \"node-resolver-dg4jx\" (UID: \"0abf9538-30e5-4e8e-8084-ecf9eee7e364\") " pod="openshift-dns/node-resolver-dg4jx" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.560444 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.573504 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.586075 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.596891 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.610280 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 
21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.630361 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.664568 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:47Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:47 crc kubenswrapper[4717]: I0221 21:46:47.930476 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 11:47:34.304789323 +0000 UTC Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.216356 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-flt22"] Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.216818 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.218754 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-bzd94"] Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.219022 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.219607 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.219756 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.220353 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.222119 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.222456 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.222641 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.222726 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.222828 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7ndm2"] Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.222959 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.223072 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.223420 4717 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.223625 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-82vcj"] Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.223851 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.224991 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.225909 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.225951 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.226123 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.226241 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.230585 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.236353 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.236432 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.236603 4717 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.240336 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.251011 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.254954 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-21 21:41:47 +0000 UTC, rotation deadline is 2026-11-19 01:09:31.288531852 +0000 UTC Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.255008 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6483h22m43.033525923s for next certificate rotation Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.268250 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.291282 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.303436 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.319584 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 
21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.336808 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340116 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-multus-socket-dir-parent\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340160 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-run-ovn-kubernetes\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340228 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-multus-daemon-config\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340260 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-run-netns\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340317 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/d99f8dec-a80d-4890-b903-fe05d6d47d62-os-release\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340339 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-run-netns\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340365 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-cni-bin\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340389 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovn-node-metrics-cert\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340413 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-openvswitch\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340630 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-run-k8s-cni-cncf-io\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340692 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-slash\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340753 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc5eeb62-90d6-4f10-9b58-f147b23eb775-proxy-tls\") pod \"machine-config-daemon-flt22\" (UID: \"cc5eeb62-90d6-4f10-9b58-f147b23eb775\") " pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340772 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-os-release\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340793 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-hostroot\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340816 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-ovn\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340838 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-node-log\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340884 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-cni-netd\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340914 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d99f8dec-a80d-4890-b903-fe05d6d47d62-system-cni-dir\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340949 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-cni-binary-copy\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340975 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/d99f8dec-a80d-4890-b903-fe05d6d47d62-cni-binary-copy\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.340997 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d99f8dec-a80d-4890-b903-fe05d6d47d62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341017 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-system-cni-dir\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341037 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-multus-cni-dir\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341065 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-var-lib-kubelet\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341099 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-etc-kubernetes\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341119 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-kubelet\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341143 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-systemd\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341163 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-var-lib-openvswitch\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341187 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-multus-conf-dir\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341212 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-cnibin\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341229 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-var-lib-cni-bin\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341246 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-etc-openvswitch\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341270 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d99f8dec-a80d-4890-b903-fe05d6d47d62-cnibin\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341287 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovnkube-config\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341315 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/d99f8dec-a80d-4890-b903-fe05d6d47d62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341332 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-var-lib-cni-multus\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341363 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnjzh\" (UniqueName: \"kubernetes.io/projected/cc5eeb62-90d6-4f10-9b58-f147b23eb775-kube-api-access-gnjzh\") pod \"machine-config-daemon-flt22\" (UID: \"cc5eeb62-90d6-4f10-9b58-f147b23eb775\") " pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341383 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hxjc\" (UniqueName: \"kubernetes.io/projected/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-kube-api-access-5hxjc\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341401 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-systemd-units\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341417 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-log-socket\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341435 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovnkube-script-lib\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341456 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341475 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-env-overrides\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341498 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56cgg\" (UniqueName: \"kubernetes.io/projected/d99f8dec-a80d-4890-b903-fe05d6d47d62-kube-api-access-56cgg\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " 
pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341520 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-run-multus-certs\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341539 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78fqm\" (UniqueName: \"kubernetes.io/projected/f6a10be9-c25d-42c3-9a4f-e2397cc64852-kube-api-access-78fqm\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341559 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cc5eeb62-90d6-4f10-9b58-f147b23eb775-rootfs\") pod \"machine-config-daemon-flt22\" (UID: \"cc5eeb62-90d6-4f10-9b58-f147b23eb775\") " pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.341577 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc5eeb62-90d6-4f10-9b58-f147b23eb775-mcd-auth-proxy-config\") pod \"machine-config-daemon-flt22\" (UID: \"cc5eeb62-90d6-4f10-9b58-f147b23eb775\") " pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.354566 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.387989 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.415509 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.432995 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.442066 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-run-k8s-cni-cncf-io\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.442134 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-slash\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.442160 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-openvswitch\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.442224 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-run-k8s-cni-cncf-io\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.442283 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc5eeb62-90d6-4f10-9b58-f147b23eb775-proxy-tls\") pod \"machine-config-daemon-flt22\" 
(UID: \"cc5eeb62-90d6-4f10-9b58-f147b23eb775\") " pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.442238 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-slash\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.442323 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-os-release\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.442430 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-openvswitch\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.442650 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-hostroot\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.442699 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-os-release\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.442354 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-hostroot\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443084 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-ovn\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443108 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-node-log\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443171 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-ovn\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443236 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d99f8dec-a80d-4890-b903-fe05d6d47d62-system-cni-dir\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443247 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-node-log\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443264 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-cni-binary-copy\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443298 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d99f8dec-a80d-4890-b903-fe05d6d47d62-system-cni-dir\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443305 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-cni-netd\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443342 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d99f8dec-a80d-4890-b903-fe05d6d47d62-cni-binary-copy\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443361 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/d99f8dec-a80d-4890-b903-fe05d6d47d62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443379 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-system-cni-dir\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443655 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-multus-cni-dir\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443676 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-var-lib-kubelet\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443696 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-multus-conf-dir\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443744 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-var-lib-kubelet\") pod \"multus-bzd94\" (UID: 
\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443563 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-system-cni-dir\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443420 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-cni-netd\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443810 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-etc-kubernetes\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443911 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-etc-kubernetes\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443924 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-multus-conf-dir\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443977 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-kubelet\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.443953 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-kubelet\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444012 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-systemd\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444032 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-var-lib-openvswitch\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444056 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-cnibin\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444077 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-var-lib-cni-bin\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444096 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-etc-openvswitch\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444105 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-systemd\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444138 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d99f8dec-a80d-4890-b903-fe05d6d47d62-cni-binary-copy\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444166 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-var-lib-cni-bin\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444169 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-var-lib-openvswitch\") pod 
\"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444193 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-cnibin\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444092 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-multus-cni-dir\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444145 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d99f8dec-a80d-4890-b903-fe05d6d47d62-cnibin\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444125 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d99f8dec-a80d-4890-b903-fe05d6d47d62-cnibin\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444200 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-etc-openvswitch\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc 
kubenswrapper[4717]: I0221 21:46:48.444288 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovnkube-config\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444330 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d99f8dec-a80d-4890-b903-fe05d6d47d62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444247 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d99f8dec-a80d-4890-b903-fe05d6d47d62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444357 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-var-lib-cni-multus\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444397 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnjzh\" (UniqueName: \"kubernetes.io/projected/cc5eeb62-90d6-4f10-9b58-f147b23eb775-kube-api-access-gnjzh\") pod \"machine-config-daemon-flt22\" (UID: \"cc5eeb62-90d6-4f10-9b58-f147b23eb775\") " pod="openshift-machine-config-operator/machine-config-daemon-flt22" 
Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444422 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hxjc\" (UniqueName: \"kubernetes.io/projected/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-kube-api-access-5hxjc\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444443 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-systemd-units\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444449 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-var-lib-cni-multus\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444470 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-log-socket\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444495 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovnkube-script-lib\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444513 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-systemd-units\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444519 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444553 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444550 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-env-overrides\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444592 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cc5eeb62-90d6-4f10-9b58-f147b23eb775-rootfs\") pod \"machine-config-daemon-flt22\" (UID: \"cc5eeb62-90d6-4f10-9b58-f147b23eb775\") " pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444616 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc5eeb62-90d6-4f10-9b58-f147b23eb775-mcd-auth-proxy-config\") pod \"machine-config-daemon-flt22\" (UID: \"cc5eeb62-90d6-4f10-9b58-f147b23eb775\") " pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444638 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56cgg\" (UniqueName: \"kubernetes.io/projected/d99f8dec-a80d-4890-b903-fe05d6d47d62-kube-api-access-56cgg\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444660 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-run-multus-certs\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444680 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78fqm\" (UniqueName: \"kubernetes.io/projected/f6a10be9-c25d-42c3-9a4f-e2397cc64852-kube-api-access-78fqm\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444704 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-multus-socket-dir-parent\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 
21:46:48.444724 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-run-ovn-kubernetes\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444761 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-multus-daemon-config\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444785 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-run-netns\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444813 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d99f8dec-a80d-4890-b903-fe05d6d47d62-os-release\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444832 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-run-netns\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444851 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-cni-bin\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444902 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovn-node-metrics-cert\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.445009 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-log-socket\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.444996 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cc5eeb62-90d6-4f10-9b58-f147b23eb775-rootfs\") pod \"machine-config-daemon-flt22\" (UID: \"cc5eeb62-90d6-4f10-9b58-f147b23eb775\") " pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.445045 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-run-ovn-kubernetes\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.445095 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-run-netns\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.445065 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d99f8dec-a80d-4890-b903-fe05d6d47d62-os-release\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.445135 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-run-netns\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.445144 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-host-run-multus-certs\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.445166 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-cni-bin\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.445220 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-multus-socket-dir-parent\") pod \"multus-bzd94\" (UID: 
\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.445543 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-env-overrides\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.445570 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovnkube-config\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.445634 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc5eeb62-90d6-4f10-9b58-f147b23eb775-mcd-auth-proxy-config\") pod \"machine-config-daemon-flt22\" (UID: \"cc5eeb62-90d6-4f10-9b58-f147b23eb775\") " pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.445744 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d99f8dec-a80d-4890-b903-fe05d6d47d62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.446091 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovnkube-script-lib\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.446276 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-multus-daemon-config\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.446412 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-cni-binary-copy\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.446941 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc5eeb62-90d6-4f10-9b58-f147b23eb775-proxy-tls\") pod \"machine-config-daemon-flt22\" (UID: \"cc5eeb62-90d6-4f10-9b58-f147b23eb775\") " pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.448384 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovn-node-metrics-cert\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.484388 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hxjc\" (UniqueName: \"kubernetes.io/projected/d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da-kube-api-access-5hxjc\") pod \"multus-bzd94\" (UID: \"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\") " pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.484810 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56cgg\" (UniqueName: \"kubernetes.io/projected/d99f8dec-a80d-4890-b903-fe05d6d47d62-kube-api-access-56cgg\") pod \"multus-additional-cni-plugins-82vcj\" (UID: \"d99f8dec-a80d-4890-b903-fe05d6d47d62\") " pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.501379 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78fqm\" (UniqueName: \"kubernetes.io/projected/f6a10be9-c25d-42c3-9a4f-e2397cc64852-kube-api-access-78fqm\") pod \"ovnkube-node-7ndm2\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.504709 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnjzh\" (UniqueName: \"kubernetes.io/projected/cc5eeb62-90d6-4f10-9b58-f147b23eb775-kube-api-access-gnjzh\") pod \"machine-config-daemon-flt22\" (UID: \"cc5eeb62-90d6-4f10-9b58-f147b23eb775\") " pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.511153 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.542557 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bzd94" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.543600 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.549884 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.554917 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.558964 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-82vcj" Feb 21 21:46:48 crc kubenswrapper[4717]: W0221 21:46:48.562739 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc5eeb62_90d6_4f10_9b58_f147b23eb775.slice/crio-7457b6f9a843c1451d24eb5d31e6946f3ee6f2d4ec5695ea3542b2c037c40d11 WatchSource:0}: Error finding container 7457b6f9a843c1451d24eb5d31e6946f3ee6f2d4ec5695ea3542b2c037c40d11: Status 404 returned error can't find the container with id 7457b6f9a843c1451d24eb5d31e6946f3ee6f2d4ec5695ea3542b2c037c40d11 Feb 21 21:46:48 crc kubenswrapper[4717]: W0221 21:46:48.569253 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8a3d061_4ed4_4ea5_83d7_d2a74b8bf5da.slice/crio-c47eba04883b4b2c67816d27c281a7c9f885ba19c92a3096dd0daa0d64e590d1 WatchSource:0}: Error finding container c47eba04883b4b2c67816d27c281a7c9f885ba19c92a3096dd0daa0d64e590d1: Status 404 returned error can't find the container with id c47eba04883b4b2c67816d27c281a7c9f885ba19c92a3096dd0daa0d64e590d1 Feb 21 21:46:48 crc kubenswrapper[4717]: W0221 21:46:48.580076 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6a10be9_c25d_42c3_9a4f_e2397cc64852.slice/crio-25c7fbb54410bde86b43a7110b201a47b7e85333bda8777ca328f34602963bd6 WatchSource:0}: Error finding container 25c7fbb54410bde86b43a7110b201a47b7e85333bda8777ca328f34602963bd6: Status 404 returned error can't find the container with id 25c7fbb54410bde86b43a7110b201a47b7e85333bda8777ca328f34602963bd6 Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.608388 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: W0221 21:46:48.615456 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd99f8dec_a80d_4890_b903_fe05d6d47d62.slice/crio-36868292bd588865379cf25d44bbd3a9f9d85384258bcce0a0f9a7e1a6714f75 WatchSource:0}: Error finding container 36868292bd588865379cf25d44bbd3a9f9d85384258bcce0a0f9a7e1a6714f75: Status 404 returned error can't find the container with id 36868292bd588865379cf25d44bbd3a9f9d85384258bcce0a0f9a7e1a6714f75 Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.654316 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.659311 4717 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-dns/node-resolver-dg4jx" secret="" err="failed to sync secret cache: timed out waiting for the condition" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.659487 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-dg4jx" Feb 21 21:46:48 crc kubenswrapper[4717]: W0221 21:46:48.672370 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0abf9538_30e5_4e8e_8084_ecf9eee7e364.slice/crio-48bdf9e8791baa241b1c4f046fa49e172c6a6403dd43154b6b4b7e28ebc6b573 WatchSource:0}: Error finding container 48bdf9e8791baa241b1c4f046fa49e172c6a6403dd43154b6b4b7e28ebc6b573: Status 404 returned error can't find the container with id 48bdf9e8791baa241b1c4f046fa49e172c6a6403dd43154b6b4b7e28ebc6b573 Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.676507 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.693605 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.712093 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.726264 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.743045 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.765775 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.784331 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:48Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.930666 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 12:38:22.170174361 +0000 UTC Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.932155 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.975748 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.975789 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:48 crc kubenswrapper[4717]: I0221 21:46:48.975770 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:46:48 crc kubenswrapper[4717]: E0221 21:46:48.975937 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:46:48 crc kubenswrapper[4717]: E0221 21:46:48.976516 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:46:48 crc kubenswrapper[4717]: E0221 21:46:48.976609 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.140304 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dg4jx" event={"ID":"0abf9538-30e5-4e8e-8084-ecf9eee7e364","Type":"ContainerStarted","Data":"a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00"} Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.140352 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dg4jx" event={"ID":"0abf9538-30e5-4e8e-8084-ecf9eee7e364","Type":"ContainerStarted","Data":"48bdf9e8791baa241b1c4f046fa49e172c6a6403dd43154b6b4b7e28ebc6b573"} Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.142942 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" 
event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104"} Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.142971 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"d4084ad77fbd48700228a07cf9b368bc2a8fb4f3b65222c7b31d74958eb4425b"} Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.142982 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"7457b6f9a843c1451d24eb5d31e6946f3ee6f2d4ec5695ea3542b2c037c40d11"} Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.144651 4717 generic.go:334] "Generic (PLEG): container finished" podID="d99f8dec-a80d-4890-b903-fe05d6d47d62" containerID="7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950" exitCode=0 Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.144693 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" event={"ID":"d99f8dec-a80d-4890-b903-fe05d6d47d62","Type":"ContainerDied","Data":"7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950"} Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.144707 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" event={"ID":"d99f8dec-a80d-4890-b903-fe05d6d47d62","Type":"ContainerStarted","Data":"36868292bd588865379cf25d44bbd3a9f9d85384258bcce0a0f9a7e1a6714f75"} Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.147237 4717 generic.go:334] "Generic (PLEG): container finished" podID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerID="6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c" exitCode=0 Feb 
21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.147280 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerDied","Data":"6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c"} Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.147297 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerStarted","Data":"25c7fbb54410bde86b43a7110b201a47b7e85333bda8777ca328f34602963bd6"} Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.149253 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bzd94" event={"ID":"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da","Type":"ContainerStarted","Data":"938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994"} Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.149361 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bzd94" event={"ID":"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da","Type":"ContainerStarted","Data":"c47eba04883b4b2c67816d27c281a7c9f885ba19c92a3096dd0daa0d64e590d1"} Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.154230 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318"} Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.173126 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.191124 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.221466 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.243160 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.262066 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.273877 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.302325 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.331939 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 
21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.357206 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.360487 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.360558 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 
21:46:49 crc kubenswrapper[4717]: E0221 21:46:49.360692 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 21:46:49 crc kubenswrapper[4717]: E0221 21:46:49.360768 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:46:53.360746631 +0000 UTC m=+28.142280253 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 21:46:49 crc kubenswrapper[4717]: E0221 21:46:49.361242 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 21:46:49 crc kubenswrapper[4717]: E0221 21:46:49.361284 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:46:53.361275005 +0000 UTC m=+28.142808637 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.404505 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.424500 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.436588 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.439602 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.443746 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.446764 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.460609 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.461073 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:46:49 crc kubenswrapper[4717]: E0221 21:46:49.461167 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:46:53.46114905 +0000 UTC m=+28.242682672 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.461204 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.461268 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:46:49 crc kubenswrapper[4717]: E0221 21:46:49.461376 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 21:46:49 crc kubenswrapper[4717]: E0221 21:46:49.461396 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 21:46:49 crc kubenswrapper[4717]: E0221 21:46:49.461400 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 21:46:49 crc kubenswrapper[4717]: E0221 21:46:49.461411 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 21:46:49 crc kubenswrapper[4717]: E0221 21:46:49.461416 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:49 crc kubenswrapper[4717]: E0221 21:46:49.461423 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:49 crc 
kubenswrapper[4717]: E0221 21:46:49.461456 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-21 21:46:53.461449438 +0000 UTC m=+28.242983060 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:49 crc kubenswrapper[4717]: E0221 21:46:49.461472 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-21 21:46:53.461465338 +0000 UTC m=+28.242998960 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.475290 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.490711 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.503351 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.539772 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.553569 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.570308 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.587508 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 
21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.602009 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.618781 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.645103 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.666156 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.688302 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.706659 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.727787 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.744184 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.759205 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.769780 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.787234 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.805256 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-polic
y-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.818721 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.838709 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.855445 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.876510 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 
21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.890958 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:49Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:49 crc kubenswrapper[4717]: I0221 21:46:49.931549 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 18:36:08.929402805 +0000 UTC Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.163725 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerStarted","Data":"8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09"} Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.164423 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerStarted","Data":"b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf"} Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.164521 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerStarted","Data":"36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156"} Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.164604 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerStarted","Data":"257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac"} Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.164743 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" 
event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerStarted","Data":"ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563"} Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.164828 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerStarted","Data":"c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723"} Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.166477 4717 generic.go:334] "Generic (PLEG): container finished" podID="d99f8dec-a80d-4890-b903-fe05d6d47d62" containerID="d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872" exitCode=0 Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.167237 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" event={"ID":"d99f8dec-a80d-4890-b903-fe05d6d47d62","Type":"ContainerDied","Data":"d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872"} Feb 21 21:46:50 crc kubenswrapper[4717]: E0221 21:46:50.179790 4717 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.200871 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 
21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:50Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.219559 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:50Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.239595 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:46:50Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.262188 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:50Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.292586 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:50Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.313649 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:50Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.330386 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:50Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.347833 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:50Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.361916 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:50Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.375952 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:50Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.402117 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:50Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.416297 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:50Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.430245 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:50Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.932101 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 19:06:08.624761817 +0000 UTC Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.975432 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.975463 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:46:50 crc kubenswrapper[4717]: I0221 21:46:50.975538 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:46:50 crc kubenswrapper[4717]: E0221 21:46:50.975609 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:46:50 crc kubenswrapper[4717]: E0221 21:46:50.975755 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:46:50 crc kubenswrapper[4717]: E0221 21:46:50.975937 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.173924 4717 generic.go:334] "Generic (PLEG): container finished" podID="d99f8dec-a80d-4890-b903-fe05d6d47d62" containerID="6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b" exitCode=0 Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.174031 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" event={"ID":"d99f8dec-a80d-4890-b903-fe05d6d47d62","Type":"ContainerDied","Data":"6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b"} Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.200020 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6
c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.221434 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.240761 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 
21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.258494 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.275186 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.293192 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 
21:46:51.312996 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.326651 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.340268 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-7l5s2"] Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.340770 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7l5s2" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.344121 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.344198 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.344320 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.346979 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.348574 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.364923 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.379784 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.397526 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.425405 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.443046 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.459702 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.478538 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.482829 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/058926f7-024e-464a-96a7-3e96a96affc7-host\") pod \"node-ca-7l5s2\" (UID: \"058926f7-024e-464a-96a7-3e96a96affc7\") " pod="openshift-image-registry/node-ca-7l5s2" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.482893 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/058926f7-024e-464a-96a7-3e96a96affc7-serviceca\") pod \"node-ca-7l5s2\" (UID: \"058926f7-024e-464a-96a7-3e96a96affc7\") " pod="openshift-image-registry/node-ca-7l5s2" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.483232 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vm86\" (UniqueName: \"kubernetes.io/projected/058926f7-024e-464a-96a7-3e96a96affc7-kube-api-access-8vm86\") pod \"node-ca-7l5s2\" (UID: \"058926f7-024e-464a-96a7-3e96a96affc7\") " pod="openshift-image-registry/node-ca-7l5s2" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.496579 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.523621 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.540802 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.561790 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.577606 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.583772 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vm86\" (UniqueName: \"kubernetes.io/projected/058926f7-024e-464a-96a7-3e96a96affc7-kube-api-access-8vm86\") pod \"node-ca-7l5s2\" (UID: \"058926f7-024e-464a-96a7-3e96a96affc7\") " pod="openshift-image-registry/node-ca-7l5s2" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.584147 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/058926f7-024e-464a-96a7-3e96a96affc7-host\") pod \"node-ca-7l5s2\" (UID: \"058926f7-024e-464a-96a7-3e96a96affc7\") " pod="openshift-image-registry/node-ca-7l5s2" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.584224 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/058926f7-024e-464a-96a7-3e96a96affc7-host\") pod \"node-ca-7l5s2\" (UID: \"058926f7-024e-464a-96a7-3e96a96affc7\") " pod="openshift-image-registry/node-ca-7l5s2" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.584276 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/058926f7-024e-464a-96a7-3e96a96affc7-serviceca\") pod \"node-ca-7l5s2\" (UID: \"058926f7-024e-464a-96a7-3e96a96affc7\") " pod="openshift-image-registry/node-ca-7l5s2" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.586402 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/058926f7-024e-464a-96a7-3e96a96affc7-serviceca\") pod \"node-ca-7l5s2\" (UID: \"058926f7-024e-464a-96a7-3e96a96affc7\") " pod="openshift-image-registry/node-ca-7l5s2" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.597277 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 
21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.612147 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vm86\" (UniqueName: \"kubernetes.io/projected/058926f7-024e-464a-96a7-3e96a96affc7-kube-api-access-8vm86\") pod \"node-ca-7l5s2\" (UID: \"058926f7-024e-464a-96a7-3e96a96affc7\") " pod="openshift-image-registry/node-ca-7l5s2" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.617101 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.633162 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.658067 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 
21:46:51.673635 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7l5s2" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.679486 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: W0221 21:46:51.696752 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod058926f7_024e_464a_96a7_3e96a96affc7.slice/crio-41d3f5a48ed284fe952c5bff9c7808c8f645cf92275c7a2950e212a9f4ee4650 WatchSource:0}: Error finding container 41d3f5a48ed284fe952c5bff9c7808c8f645cf92275c7a2950e212a9f4ee4650: Status 404 returned error can't find the container with id 41d3f5a48ed284fe952c5bff9c7808c8f645cf92275c7a2950e212a9f4ee4650 Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.710346 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:51Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:51 crc kubenswrapper[4717]: I0221 21:46:51.933309 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 16:22:32.661686372 +0000 UTC Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.081202 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.084185 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.084221 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.084233 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 
21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.084342 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.095226 4717 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.095451 4717 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.096653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.096723 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.096740 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.096766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.096784 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:52Z","lastTransitionTime":"2026-02-21T21:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:52 crc kubenswrapper[4717]: E0221 21:46:52.122278 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.140506 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.140572 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.140586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.140609 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.140624 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:52Z","lastTransitionTime":"2026-02-21T21:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:52 crc kubenswrapper[4717]: E0221 21:46:52.156353 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.160229 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.160274 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.160285 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.160307 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.160319 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:52Z","lastTransitionTime":"2026-02-21T21:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:52 crc kubenswrapper[4717]: E0221 21:46:52.175784 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.180145 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.180204 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.180225 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.180250 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.180266 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:52Z","lastTransitionTime":"2026-02-21T21:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.180528 4717 generic.go:334] "Generic (PLEG): container finished" podID="d99f8dec-a80d-4890-b903-fe05d6d47d62" containerID="768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3" exitCode=0 Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.180605 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" event={"ID":"d99f8dec-a80d-4890-b903-fe05d6d47d62","Type":"ContainerDied","Data":"768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3"} Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.185351 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerStarted","Data":"a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887"} Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.187340 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7l5s2" event={"ID":"058926f7-024e-464a-96a7-3e96a96affc7","Type":"ContainerStarted","Data":"5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3"} Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.187380 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7l5s2" event={"ID":"058926f7-024e-464a-96a7-3e96a96affc7","Type":"ContainerStarted","Data":"41d3f5a48ed284fe952c5bff9c7808c8f645cf92275c7a2950e212a9f4ee4650"} Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.205230 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: E0221 21:46:52.205225 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.209008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.209043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.209057 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.209075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.209089 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:52Z","lastTransitionTime":"2026-02-21T21:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.223948 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: E0221 21:46:52.241153 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: E0221 21:46:52.241346 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.243209 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.244018 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.244056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.244069 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.244089 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.244104 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:52Z","lastTransitionTime":"2026-02-21T21:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.262204 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.279382 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.295688 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.309538 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.323711 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.338835 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.346855 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.346916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.346934 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.346955 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.346967 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:52Z","lastTransitionTime":"2026-02-21T21:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.355748 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.373003 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.389744 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.409416 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.421009 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.473793 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.473836 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.473848 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 
21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.473882 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.473897 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:52Z","lastTransitionTime":"2026-02-21T21:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.478275 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a
385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.570538 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.578709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.578736 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.578745 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.578760 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.578771 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:52Z","lastTransitionTime":"2026-02-21T21:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.602785 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.616297 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.629307 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.639293 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.659267 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.670248 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.681278 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.681421 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.681498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 
21:46:52.681580 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.681643 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:52Z","lastTransitionTime":"2026-02-21T21:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.685415 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.695890 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.703779 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.716246 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.731290 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-polic
y-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.744552 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:52Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.785062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.785119 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.785130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:52 crc 
kubenswrapper[4717]: I0221 21:46:52.785176 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.785189 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:52Z","lastTransitionTime":"2026-02-21T21:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.888374 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.888424 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.888437 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.888457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.888472 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:52Z","lastTransitionTime":"2026-02-21T21:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.933942 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 21:08:42.499432764 +0000 UTC Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.975512 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.975572 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:46:52 crc kubenswrapper[4717]: E0221 21:46:52.975759 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.975906 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:46:52 crc kubenswrapper[4717]: E0221 21:46:52.976168 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:46:52 crc kubenswrapper[4717]: E0221 21:46:52.975966 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.991549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.992011 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.992075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.992145 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:52 crc kubenswrapper[4717]: I0221 21:46:52.992205 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:52Z","lastTransitionTime":"2026-02-21T21:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.101292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.101639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.101798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.101958 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.102251 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:53Z","lastTransitionTime":"2026-02-21T21:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.196047 4717 generic.go:334] "Generic (PLEG): container finished" podID="d99f8dec-a80d-4890-b903-fe05d6d47d62" containerID="efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee" exitCode=0 Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.196109 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" event={"ID":"d99f8dec-a80d-4890-b903-fe05d6d47d62","Type":"ContainerDied","Data":"efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee"} Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.212114 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.212184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.212209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.212239 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.212258 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:53Z","lastTransitionTime":"2026-02-21T21:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.223022 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.242521 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.264090 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.282529 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.302222 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.315383 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.315448 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.315468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.315498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.315525 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:53Z","lastTransitionTime":"2026-02-21T21:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.325849 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:53Z 
is after 2025-08-24T17:21:41Z" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.362256 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.380607 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.403491 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.403603 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.403664 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf98
57b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:53 crc kubenswrapper[4717]: E0221 21:46:53.403746 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 21:46:53 crc kubenswrapper[4717]: E0221 21:46:53.403846 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-21 21:47:01.403819195 +0000 UTC m=+36.185352857 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 21:46:53 crc kubenswrapper[4717]: E0221 21:46:53.403935 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 21:46:53 crc kubenswrapper[4717]: E0221 21:46:53.404219 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:47:01.404142033 +0000 UTC m=+36.185675695 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.418061 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-21T21:46:53Z is after 2025-08-24T17:21:41Z"
Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.418313 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.418374 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.418386 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.418407 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.418436 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:53Z","lastTransitionTime":"2026-02-21T21:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.436590 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.451097 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:53Z is after 2025-08-24T17:21:41Z" Feb 21 
21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.467811 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.491905 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.504230 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.504491 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:46:53 crc kubenswrapper[4717]: E0221 21:46:53.504523 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-21 21:47:01.504480141 +0000 UTC m=+36.286013803 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.504674 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:46:53 crc kubenswrapper[4717]: E0221 21:46:53.504720 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 21:46:53 crc kubenswrapper[4717]: E0221 21:46:53.504757 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 21:46:53 crc kubenswrapper[4717]: E0221 21:46:53.504782 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:53 crc kubenswrapper[4717]: E0221 21:46:53.504884 4717 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-21 21:47:01.504830199 +0000 UTC m=+36.286363861 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:53 crc kubenswrapper[4717]: E0221 21:46:53.505038 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 21:46:53 crc kubenswrapper[4717]: E0221 21:46:53.505088 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 21:46:53 crc kubenswrapper[4717]: E0221 21:46:53.505108 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:53 crc kubenswrapper[4717]: E0221 21:46:53.505213 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-21 21:47:01.505176338 +0000 UTC m=+36.286709970 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.521251 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.521317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.521335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.521362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.521383 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:53Z","lastTransitionTime":"2026-02-21T21:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.624194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.624267 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.624287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.624316 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.624337 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:53Z","lastTransitionTime":"2026-02-21T21:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.727912 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.727983 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.728001 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.728032 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.728051 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:53Z","lastTransitionTime":"2026-02-21T21:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.831955 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.832041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.832066 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.832101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.832131 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:53Z","lastTransitionTime":"2026-02-21T21:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.934418 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 15:30:25.631267049 +0000 UTC Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.934738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.934766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.934778 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.934793 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:53 crc kubenswrapper[4717]: I0221 21:46:53.934821 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:53Z","lastTransitionTime":"2026-02-21T21:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.040564 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.040641 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.040664 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.040693 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.040717 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:54Z","lastTransitionTime":"2026-02-21T21:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.144620 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.144723 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.144749 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.144779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.144803 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:54Z","lastTransitionTime":"2026-02-21T21:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.208773 4717 generic.go:334] "Generic (PLEG): container finished" podID="d99f8dec-a80d-4890-b903-fe05d6d47d62" containerID="486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d" exitCode=0 Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.208855 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" event={"ID":"d99f8dec-a80d-4890-b903-fe05d6d47d62","Type":"ContainerDied","Data":"486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d"} Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.229239 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:54Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.247713 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:54Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.250081 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.250289 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.250306 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:54 crc 
kubenswrapper[4717]: I0221 21:46:54.250330 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.250347 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:54Z","lastTransitionTime":"2026-02-21T21:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.263160 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:54Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.289770 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:54Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.325551 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:54Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.340555 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:54Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.354430 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.354486 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.354501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 
21:46:54.354534 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.354558 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:54Z","lastTransitionTime":"2026-02-21T21:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.363441 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:54Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.383496 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:54Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.406424 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 
21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:54Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.425497 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:54Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.449466 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:46:54Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.460399 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.460446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.460461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.460482 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.460499 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:54Z","lastTransitionTime":"2026-02-21T21:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.475159 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:54Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.494957 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:54Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.518364 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:54Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.565155 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.565235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.565258 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.565292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.565318 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:54Z","lastTransitionTime":"2026-02-21T21:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.675555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.676014 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.676031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.676056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.676079 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:54Z","lastTransitionTime":"2026-02-21T21:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.779989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.780041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.780058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.780085 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.780105 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:54Z","lastTransitionTime":"2026-02-21T21:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.884877 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.884917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.884931 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.884951 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.884966 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:54Z","lastTransitionTime":"2026-02-21T21:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.934939 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 10:43:22.09659947 +0000 UTC Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.975523 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.975670 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:46:54 crc kubenswrapper[4717]: E0221 21:46:54.975830 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.976111 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:54 crc kubenswrapper[4717]: E0221 21:46:54.976367 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:46:54 crc kubenswrapper[4717]: E0221 21:46:54.976645 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.988025 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.988086 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.988096 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.988117 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:54 crc kubenswrapper[4717]: I0221 21:46:54.988130 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:54Z","lastTransitionTime":"2026-02-21T21:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.093583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.093628 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.093639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.093657 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.093668 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:55Z","lastTransitionTime":"2026-02-21T21:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.196790 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.197346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.197428 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.197502 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.197568 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:55Z","lastTransitionTime":"2026-02-21T21:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.219515 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" event={"ID":"d99f8dec-a80d-4890-b903-fe05d6d47d62","Type":"ContainerStarted","Data":"abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367"} Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.226693 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerStarted","Data":"2aadd4bc35404cecbbe47e5ce98967391d446c9560f2cca6bed5e99314790f77"} Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.227284 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.240350 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.257576 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.257981 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.271240 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 
21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.287293 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.300921 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.300980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.301000 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.301021 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.301033 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:55Z","lastTransitionTime":"2026-02-21T21:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.301937 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.317485 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.333487 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.351775 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.372205 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.390474 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.404212 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.404279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.404292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.404330 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.404344 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:55Z","lastTransitionTime":"2026-02-21T21:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.406587 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.422995 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4c
c403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.448171 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.464810 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.480097 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.498314 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.507731 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.507839 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.507890 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.507916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.507931 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:55Z","lastTransitionTime":"2026-02-21T21:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.515757 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.532365 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.554503 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.579992 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2aadd4bc35404cecbbe47e5ce98967391d446c9560f2cca6bed5e99314790f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.598370 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.611676 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.611745 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.611759 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.611777 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.611789 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:55Z","lastTransitionTime":"2026-02-21T21:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.613834 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.635552 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 
21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.651926 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.666009 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.681120 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.695763 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.714279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.714336 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.714349 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.714369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.714382 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:55Z","lastTransitionTime":"2026-02-21T21:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.715533 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.731687 4717 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.817301 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.817410 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.817467 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.817500 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.817556 4717 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:55Z","lastTransitionTime":"2026-02-21T21:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.921480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.921636 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.921659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.921732 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.921790 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:55Z","lastTransitionTime":"2026-02-21T21:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.935707 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 07:32:18.623561328 +0000 UTC Feb 21 21:46:55 crc kubenswrapper[4717]: I0221 21:46:55.994929 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02
-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.019684 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2aadd4bc35404cecbbe47e5ce98967391d446c9560f2cca6bed5e99314790f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.024574 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.024636 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.024658 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.024686 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.024708 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:56Z","lastTransitionTime":"2026-02-21T21:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.032524 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.050035 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.064813 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.087393 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.108479 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.123948 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.128137 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.128194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.128213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:56 crc 
kubenswrapper[4717]: I0221 21:46:56.128240 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.128260 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:56Z","lastTransitionTime":"2026-02-21T21:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.149160 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.172040 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7
b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.194690 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.215469 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 
21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.231449 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.231498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.231512 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.231536 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.231551 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:56Z","lastTransitionTime":"2026-02-21T21:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.232916 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.233478 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.233642 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.251236 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.302579 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.309273 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.319894 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secre
ts/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.335328 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.335980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.336011 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.336023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.336039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.336051 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:56Z","lastTransitionTime":"2026-02-21T21:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.413485 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.430339 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.442126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.442178 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.442193 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.442219 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.442235 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:56Z","lastTransitionTime":"2026-02-21T21:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.457145 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z 
is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.497482 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2aadd4bc35404cecbbe47e5ce98967391d446c9560f2cca6bed5e99314790f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.512189 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.524315 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.541272 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 
21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.545552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.545603 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.545619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.545637 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.545648 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:56Z","lastTransitionTime":"2026-02-21T21:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.556449 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.568743 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.587168 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.598958 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.611385 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.624829 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a046
5d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.640948 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.648317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.648353 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.648365 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.648381 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.648395 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:56Z","lastTransitionTime":"2026-02-21T21:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.656784 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.677298 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.693607 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.707398 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.720726 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.742162 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.756510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.756580 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.756597 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.756618 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.756632 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:56Z","lastTransitionTime":"2026-02-21T21:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.758910 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.786033 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4c
c403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.825725 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2aadd4bc35404cecbbe47e5ce98967391d446c9560f2cca6bed5e99314790f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.844744 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9
dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.859852 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.859924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.859943 4717 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.859966 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.859982 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:56Z","lastTransitionTime":"2026-02-21T21:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.868807 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07
372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.884183 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.936819 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 12:56:54.813251105 +0000 UTC Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.962699 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:56 crc 
kubenswrapper[4717]: I0221 21:46:56.962951 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.963045 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.963122 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.963187 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:56Z","lastTransitionTime":"2026-02-21T21:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.976131 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:46:56 crc kubenswrapper[4717]: E0221 21:46:56.976490 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.976174 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:46:56 crc kubenswrapper[4717]: E0221 21:46:56.976681 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:46:56 crc kubenswrapper[4717]: I0221 21:46:56.976131 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:56 crc kubenswrapper[4717]: E0221 21:46:56.976851 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.065656 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.065720 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.065738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.065765 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.065782 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:57Z","lastTransitionTime":"2026-02-21T21:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.169186 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.169236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.169248 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.169270 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.169284 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:57Z","lastTransitionTime":"2026-02-21T21:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.236260 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.273449 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.273499 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.273512 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.273532 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.273547 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:57Z","lastTransitionTime":"2026-02-21T21:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.376716 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.376769 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.376785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.376811 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.376831 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:57Z","lastTransitionTime":"2026-02-21T21:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.481051 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.481120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.481144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.481174 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.481195 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:57Z","lastTransitionTime":"2026-02-21T21:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.585037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.585108 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.585124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.585152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.585171 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:57Z","lastTransitionTime":"2026-02-21T21:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.690312 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.690431 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.690450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.690480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.690502 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:57Z","lastTransitionTime":"2026-02-21T21:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.794047 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.794145 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.794169 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.794207 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.794234 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:57Z","lastTransitionTime":"2026-02-21T21:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.897995 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.898067 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.898085 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.898111 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.898130 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:57Z","lastTransitionTime":"2026-02-21T21:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:57 crc kubenswrapper[4717]: I0221 21:46:57.937815 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 11:11:16.692163142 +0000 UTC Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.000414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.000473 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.000494 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.000516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.000535 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:58Z","lastTransitionTime":"2026-02-21T21:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.104016 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.104086 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.104105 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.104135 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.104154 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:58Z","lastTransitionTime":"2026-02-21T21:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.208351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.208422 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.208441 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.208469 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.208491 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:58Z","lastTransitionTime":"2026-02-21T21:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.242943 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovnkube-controller/0.log" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.247540 4717 generic.go:334] "Generic (PLEG): container finished" podID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerID="2aadd4bc35404cecbbe47e5ce98967391d446c9560f2cca6bed5e99314790f77" exitCode=1 Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.247609 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerDied","Data":"2aadd4bc35404cecbbe47e5ce98967391d446c9560f2cca6bed5e99314790f77"} Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.248905 4717 scope.go:117] "RemoveContainer" containerID="2aadd4bc35404cecbbe47e5ce98967391d446c9560f2cca6bed5e99314790f77" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.284679 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2aadd4bc35404cecbbe47e5ce98967391d446c9560f2cca6bed5e99314790f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2aadd4bc35404cecbbe47e5ce98967391d446c9560f2cca6bed5e99314790f77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\
\\"2026-02-21T21:46:57Z\\\",\\\"message\\\":\\\"57.554323 6019 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0221 21:46:57.554434 6019 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 21:46:57.554554 6019 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0221 21:46:57.554594 6019 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 21:46:57.554705 6019 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 21:46:57.554907 6019 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 21:46:57.555416 6019 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0221 21:46:57.555824 6019 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0221 21:46:57.555891 6019 factory.go:656] Stopping watch factory\\\\nI0221 21:46:57.555911 6019 ovnkube.go:599] Stopped ovnkube\\\\nI0221 
21:46:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f
579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:58Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.310816 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:58Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.316724 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.318283 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.318333 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.318365 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.318382 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:58Z","lastTransitionTime":"2026-02-21T21:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.341467 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:58Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.366241 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:58Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.389130 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:58Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.416551 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:58Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.422951 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.423004 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.423023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.423051 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.423069 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:58Z","lastTransitionTime":"2026-02-21T21:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.434561 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c924
68c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:58Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.453038 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:58Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.479158 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7
b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:58Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.503065 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:58Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.524426 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:58Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.527411 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.527436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.527446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.527461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.527471 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:58Z","lastTransitionTime":"2026-02-21T21:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.539208 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:58Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.554333 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:58Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.572662 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:58Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.631541 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.631614 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.631632 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.631658 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.631676 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:58Z","lastTransitionTime":"2026-02-21T21:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.735126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.735286 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.735309 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.735340 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.735371 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:58Z","lastTransitionTime":"2026-02-21T21:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.839017 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.839082 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.839103 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.839131 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.839149 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:58Z","lastTransitionTime":"2026-02-21T21:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.938955 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 16:20:41.977613653 +0000 UTC Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.941738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.941781 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.941806 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.941825 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.941837 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:58Z","lastTransitionTime":"2026-02-21T21:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.976349 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.976409 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:46:58 crc kubenswrapper[4717]: I0221 21:46:58.976371 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:46:58 crc kubenswrapper[4717]: E0221 21:46:58.976539 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:46:58 crc kubenswrapper[4717]: E0221 21:46:58.976715 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:46:58 crc kubenswrapper[4717]: E0221 21:46:58.976910 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.044369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.044406 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.044416 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.044431 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.044439 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:59Z","lastTransitionTime":"2026-02-21T21:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.147723 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.147764 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.147775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.147794 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.147806 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:59Z","lastTransitionTime":"2026-02-21T21:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.250229 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.250287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.250300 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.250322 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.250337 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:59Z","lastTransitionTime":"2026-02-21T21:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.252895 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovnkube-controller/0.log" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.256139 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerStarted","Data":"586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768"} Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.256332 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.274765 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276
703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:59Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.292827 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7
b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:59Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.319111 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:59Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.341726 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:59Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.353614 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.353651 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.353661 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.353675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.353685 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:59Z","lastTransitionTime":"2026-02-21T21:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.359998 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:59Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.377786 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:59Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.397739 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:59Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.429467 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2aadd4bc35404cecbbe47e5ce98967391d446c9560f2cca6bed5e99314790f77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:46:57Z\\\",\\\"message\\\":\\\"57.554323 6019 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0221 21:46:57.554434 6019 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 21:46:57.554554 6019 
reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0221 21:46:57.554594 6019 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 21:46:57.554705 6019 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 21:46:57.554907 6019 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 21:46:57.555416 6019 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0221 21:46:57.555824 6019 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0221 21:46:57.555891 6019 factory.go:656] Stopping watch factory\\\\nI0221 21:46:57.555911 6019 ovnkube.go:599] Stopped ovnkube\\\\nI0221 
21:46:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:59Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.444979 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:59Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.457602 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.457635 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.457646 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.457671 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.457682 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:59Z","lastTransitionTime":"2026-02-21T21:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.468257 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:59Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.483973 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:59Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.496648 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:59Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.511450 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:59Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.525414 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:46:59Z is after 2025-08-24T17:21:41Z" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.561330 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.561430 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.561444 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:59 crc 
kubenswrapper[4717]: I0221 21:46:59.561466 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.561482 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:59Z","lastTransitionTime":"2026-02-21T21:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.665111 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.665176 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.665189 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.665213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.665227 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:59Z","lastTransitionTime":"2026-02-21T21:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.779426 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.779481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.779494 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.779515 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.779530 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:59Z","lastTransitionTime":"2026-02-21T21:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.882042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.882106 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.882123 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.882144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.882156 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:59Z","lastTransitionTime":"2026-02-21T21:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.939567 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 02:19:54.110893489 +0000 UTC Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.984391 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.984435 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.984447 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.984464 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:46:59 crc kubenswrapper[4717]: I0221 21:46:59.984477 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:46:59Z","lastTransitionTime":"2026-02-21T21:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.087755 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.087843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.087903 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.087932 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.087951 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:00Z","lastTransitionTime":"2026-02-21T21:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.190840 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.190914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.190930 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.190949 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.190963 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:00Z","lastTransitionTime":"2026-02-21T21:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.264144 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovnkube-controller/1.log" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.265342 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovnkube-controller/0.log" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.270087 4717 generic.go:334] "Generic (PLEG): container finished" podID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerID="586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768" exitCode=1 Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.270195 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerDied","Data":"586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768"} Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.270342 4717 scope.go:117] "RemoveContainer" containerID="2aadd4bc35404cecbbe47e5ce98967391d446c9560f2cca6bed5e99314790f77" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.271460 4717 scope.go:117] "RemoveContainer" containerID="586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768" Feb 21 21:47:00 crc kubenswrapper[4717]: E0221 21:47:00.271761 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.294072 4717 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.294150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.294179 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.294209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.294246 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:00Z","lastTransitionTime":"2026-02-21T21:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.298715 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:00Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.317751 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:00Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.344029 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:00Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.363259 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825
771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 
only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:00Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.386193 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:00Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.397665 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.397745 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.397766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.397793 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.397813 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:00Z","lastTransitionTime":"2026-02-21T21:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.405277 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:00Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.423002 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:00Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.445055 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:00Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.480914 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2aadd4bc35404cecbbe47e5ce98967391d446c9560f2cca6bed5e99314790f77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:46:57Z\\\",\\\"message\\\":\\\"57.554323 6019 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0221 21:46:57.554434 6019 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 21:46:57.554554 6019 
reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0221 21:46:57.554594 6019 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 21:46:57.554705 6019 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 21:46:57.554907 6019 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0221 21:46:57.555416 6019 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0221 21:46:57.555824 6019 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0221 21:46:57.555891 6019 factory.go:656] Stopping watch factory\\\\nI0221 21:46:57.555911 6019 ovnkube.go:599] Stopped ovnkube\\\\nI0221 21:46:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:46:59Z\\\",\\\"message\\\":\\\"e-identity/network-node-identity-vrzqb\\\\nI0221 21:46:59.315913 6161 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-plane-machine-set-operator]} name:Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0221 21:46:59.315959 6161 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0221 21:46:59.315959 6161 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node networ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/et
c/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:00Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.501807 4717 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.501910 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.501933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.501961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.501981 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:00Z","lastTransitionTime":"2026-02-21T21:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.505280 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:00Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.528493 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:00Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.549923 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:00Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.570735 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:00Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.592650 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:00Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.605438 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.605492 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.605510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:00 crc 
kubenswrapper[4717]: I0221 21:47:00.605534 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.605550 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:00Z","lastTransitionTime":"2026-02-21T21:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.708581 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.708667 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.708693 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.708730 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.708761 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:00Z","lastTransitionTime":"2026-02-21T21:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.812518 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.812576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.812601 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.812628 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.812649 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:00Z","lastTransitionTime":"2026-02-21T21:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.915672 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.915729 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.915741 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.915768 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.915785 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:00Z","lastTransitionTime":"2026-02-21T21:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.940150 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 07:55:55.382977477 +0000 UTC Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.975726 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.975776 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:00 crc kubenswrapper[4717]: E0221 21:47:00.976058 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:00 crc kubenswrapper[4717]: E0221 21:47:00.976290 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:00 crc kubenswrapper[4717]: I0221 21:47:00.976338 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:00 crc kubenswrapper[4717]: E0221 21:47:00.976527 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.019212 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.019260 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.019270 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.019287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.019299 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:01Z","lastTransitionTime":"2026-02-21T21:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.122994 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.123053 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.123064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.123082 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.123097 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:01Z","lastTransitionTime":"2026-02-21T21:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.227055 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.227124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.227142 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.227170 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.227190 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:01Z","lastTransitionTime":"2026-02-21T21:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.277993 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovnkube-controller/1.log" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.285053 4717 scope.go:117] "RemoveContainer" containerID="586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768" Feb 21 21:47:01 crc kubenswrapper[4717]: E0221 21:47:01.285250 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.306825 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.325504 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.330735 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.330770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.330783 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.330804 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.330817 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:01Z","lastTransitionTime":"2026-02-21T21:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.346166 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.363058 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.378526 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.399493 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.423640 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:46:59Z\\\",\\\"message\\\":\\\"e-identity/network-node-identity-vrzqb\\\\nI0221 21:46:59.315913 6161 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-plane-machine-set-operator]} 
name:Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0221 21:46:59.315959 6161 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0221 21:46:59.315959 6161 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node networ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.433940 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.434010 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.434037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.434070 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.434090 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:01Z","lastTransitionTime":"2026-02-21T21:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.440061 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.461719 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.480223 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.500618 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:01 crc kubenswrapper[4717]: E0221 21:47:01.500842 4717 secret.go:188] 
Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.500909 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:01 crc kubenswrapper[4717]: E0221 21:47:01.501009 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:47:17.500981391 +0000 UTC m=+52.282515053 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 21:47:01 crc kubenswrapper[4717]: E0221 21:47:01.501812 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.502594 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv"] Feb 21 21:47:01 crc kubenswrapper[4717]: E0221 21:47:01.504181 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:47:17.504151421 +0000 UTC m=+52.285685073 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.504502 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints 
registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.505001 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.508107 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.508508 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.536010 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.537899 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:01 crc kubenswrapper[4717]: 
I0221 21:47:01.538009 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.538065 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.538096 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.538118 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:01Z","lastTransitionTime":"2026-02-21T21:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.557246 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.583749 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.602492 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.602683 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/287b67d9-4a4e-4dcc-9723-70c8ac00c1ab-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-m58jv\" (UID: \"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.602792 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/287b67d9-4a4e-4dcc-9723-70c8ac00c1ab-env-overrides\") pod \"ovnkube-control-plane-749d76644c-m58jv\" (UID: \"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.602855 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rtmd\" (UniqueName: \"kubernetes.io/projected/287b67d9-4a4e-4dcc-9723-70c8ac00c1ab-kube-api-access-6rtmd\") pod \"ovnkube-control-plane-749d76644c-m58jv\" (UID: \"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.602958 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.603051 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/287b67d9-4a4e-4dcc-9723-70c8ac00c1ab-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-m58jv\" (UID: \"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.603110 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:01 crc kubenswrapper[4717]: E0221 21:47:01.603324 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 21:47:01 crc kubenswrapper[4717]: E0221 21:47:01.603379 4717 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 21:47:01 crc kubenswrapper[4717]: E0221 21:47:01.603408 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:47:01 crc kubenswrapper[4717]: E0221 21:47:01.603491 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-21 21:47:17.603468043 +0000 UTC m=+52.385001695 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:47:01 crc kubenswrapper[4717]: E0221 21:47:01.603590 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:47:17.603576925 +0000 UTC m=+52.385110577 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:47:01 crc kubenswrapper[4717]: E0221 21:47:01.603713 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 21:47:01 crc kubenswrapper[4717]: E0221 21:47:01.603734 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 21:47:01 crc kubenswrapper[4717]: E0221 21:47:01.603750 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:47:01 crc kubenswrapper[4717]: E0221 21:47:01.603797 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-21 21:47:17.603780081 +0000 UTC m=+52.385313733 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.606091 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/
var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.629517 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.641214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.641298 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.641317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.641343 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.641362 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:01Z","lastTransitionTime":"2026-02-21T21:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.654275 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.675323 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-
21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.693613 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.704478 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/287b67d9-4a4e-4dcc-9723-70c8ac00c1ab-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-m58jv\" (UID: \"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.704731 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/287b67d9-4a4e-4dcc-9723-70c8ac00c1ab-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-m58jv\" (UID: \"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.704963 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/287b67d9-4a4e-4dcc-9723-70c8ac00c1ab-env-overrides\") pod \"ovnkube-control-plane-749d76644c-m58jv\" (UID: \"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.705172 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rtmd\" (UniqueName: \"kubernetes.io/projected/287b67d9-4a4e-4dcc-9723-70c8ac00c1ab-kube-api-access-6rtmd\") pod \"ovnkube-control-plane-749d76644c-m58jv\" (UID: \"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.705966 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/287b67d9-4a4e-4dcc-9723-70c8ac00c1ab-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-m58jv\" (UID: \"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.706067 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/287b67d9-4a4e-4dcc-9723-70c8ac00c1ab-env-overrides\") pod \"ovnkube-control-plane-749d76644c-m58jv\" (UID: \"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.712149 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.723531 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/287b67d9-4a4e-4dcc-9723-70c8ac00c1ab-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-m58jv\" (UID: \"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.727000 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rtmd\" (UniqueName: \"kubernetes.io/projected/287b67d9-4a4e-4dcc-9723-70c8ac00c1ab-kube-api-access-6rtmd\") pod \"ovnkube-control-plane-749d76644c-m58jv\" (UID: \"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 
21:47:01.734298 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.744996 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.745037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.745051 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.745076 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.745094 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:01Z","lastTransitionTime":"2026-02-21T21:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.753237 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z 
is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.780125 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:46:59Z\\\",\\\"message\\\":\\\"e-identity/network-node-identity-vrzqb\\\\nI0221 21:46:59.315913 6161 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-plane-machine-set-operator]} 
name:Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0221 21:46:59.315959 6161 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0221 21:46:59.315959 6161 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node networ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.794291 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.817998 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.833135 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.838923 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.849974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.850030 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.850054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.850086 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.850141 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:01Z","lastTransitionTime":"2026-02-21T21:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:01 crc kubenswrapper[4717]: W0221 21:47:01.852700 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod287b67d9_4a4e_4dcc_9723_70c8ac00c1ab.slice/crio-d0890afe4d2eb43cfa2e2206d4c6930e639396df6160a4cf50f4da418f5b542a WatchSource:0}: Error finding container d0890afe4d2eb43cfa2e2206d4c6930e639396df6160a4cf50f4da418f5b542a: Status 404 returned error can't find the container with id d0890afe4d2eb43cfa2e2206d4c6930e639396df6160a4cf50f4da418f5b542a Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.863339 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.887097 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.906330 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:01Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.940346 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 19:53:28.565862431 +0000 UTC Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.953579 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.953626 4717 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.953637 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.953658 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:01 crc kubenswrapper[4717]: I0221 21:47:01.953670 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:01Z","lastTransitionTime":"2026-02-21T21:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.057276 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.057333 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.057344 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.057366 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.057378 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:02Z","lastTransitionTime":"2026-02-21T21:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.160054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.160117 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.160140 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.160171 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.160197 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:02Z","lastTransitionTime":"2026-02-21T21:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.267668 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.267773 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.267791 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.267841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.267886 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:02Z","lastTransitionTime":"2026-02-21T21:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.296766 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" event={"ID":"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab","Type":"ContainerStarted","Data":"f6f4a02f6e4678eba7ab14081252841fc9926950e9f12fc0ac16a0fec0e26c19"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.296842 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" event={"ID":"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab","Type":"ContainerStarted","Data":"e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.296857 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" event={"ID":"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab","Type":"ContainerStarted","Data":"d0890afe4d2eb43cfa2e2206d4c6930e639396df6160a4cf50f4da418f5b542a"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.312638 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.334773 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.351116 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.352123 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.352170 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.352186 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.352211 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.352227 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:02Z","lastTransitionTime":"2026-02-21T21:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:02 crc kubenswrapper[4717]: E0221 21:47:02.368303 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.373453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.373518 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.373534 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.373559 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.373573 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:02Z","lastTransitionTime":"2026-02-21T21:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.374798 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.395281 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: E0221 21:47:02.398036 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.402979 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.403024 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.403037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.403055 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.403066 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:02Z","lastTransitionTime":"2026-02-21T21:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.415243 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z 
is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: E0221 21:47:02.424004 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.432637 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.432696 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.432712 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.432782 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.432824 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:02Z","lastTransitionTime":"2026-02-21T21:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.445831 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:46:59Z\\\",\\\"message\\\":\\\"e-identity/network-node-identity-vrzqb\\\\nI0221 21:46:59.315913 6161 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-plane-machine-set-operator]} 
name:Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0221 21:46:59.315959 6161 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0221 21:46:59.315959 6161 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node networ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: E0221 21:47:02.450479 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.454215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.454261 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.454275 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.454299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.454313 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:02Z","lastTransitionTime":"2026-02-21T21:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.457623 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: E0221 21:47:02.469232 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: E0221 21:47:02.469376 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.471370 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.471430 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.471449 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.471472 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.471489 4717 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:02Z","lastTransitionTime":"2026-02-21T21:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.474334 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},
{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.509725 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.555009 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926
950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.573973 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.574026 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.574042 4717 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.574061 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.574072 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:02Z","lastTransitionTime":"2026-02-21T21:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.580770 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.596569 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 
21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.610917 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.629480 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc
4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\
\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.654524 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-gt2bg"] Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.655160 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:02 crc kubenswrapper[4717]: E0221 21:47:02.655240 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.668839 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc 
kubenswrapper[4717]: I0221 21:47:02.676560 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.676642 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.676657 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.676677 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.676687 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:02Z","lastTransitionTime":"2026-02-21T21:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.682426 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.695891 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.708589 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.714893 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs\") pod \"network-metrics-daemon-gt2bg\" (UID: \"8203b79d-1367-43b6-8567-797ec1b0c09b\") " pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.714959 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz2v5\" (UniqueName: \"kubernetes.io/projected/8203b79d-1367-43b6-8567-797ec1b0c09b-kube-api-access-wz2v5\") pod \"network-metrics-daemon-gt2bg\" (UID: \"8203b79d-1367-43b6-8567-797ec1b0c09b\") " pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.729424 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.761494 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:46:59Z\\\",\\\"message\\\":\\\"e-identity/network-node-identity-vrzqb\\\\nI0221 21:46:59.315913 6161 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-plane-machine-set-operator]} 
name:Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0221 21:46:59.315959 6161 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0221 21:46:59.315959 6161 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node networ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.775930 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.779901 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.779960 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.779980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.780009 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.780027 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:02Z","lastTransitionTime":"2026-02-21T21:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.794078 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c924
68c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.808186 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.816231 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz2v5\" (UniqueName: \"kubernetes.io/projected/8203b79d-1367-43b6-8567-797ec1b0c09b-kube-api-access-wz2v5\") pod \"network-metrics-daemon-gt2bg\" (UID: \"8203b79d-1367-43b6-8567-797ec1b0c09b\") " pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.816356 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs\") pod \"network-metrics-daemon-gt2bg\" (UID: \"8203b79d-1367-43b6-8567-797ec1b0c09b\") " pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:02 crc kubenswrapper[4717]: E0221 21:47:02.816523 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 21:47:02 crc kubenswrapper[4717]: E0221 21:47:02.816624 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs podName:8203b79d-1367-43b6-8567-797ec1b0c09b nodeName:}" failed. No retries permitted until 2026-02-21 21:47:03.316595682 +0000 UTC m=+38.098129314 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs") pod "network-metrics-daemon-gt2bg" (UID: "8203b79d-1367-43b6-8567-797ec1b0c09b") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.822562 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926
950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.836336 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz2v5\" (UniqueName: \"kubernetes.io/projected/8203b79d-1367-43b6-8567-797ec1b0c09b-kube-api-access-wz2v5\") pod \"network-metrics-daemon-gt2bg\" (UID: \"8203b79d-1367-43b6-8567-797ec1b0c09b\") " pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 
21:47:02.838469 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef69
22c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 
21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.854164 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.868794 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.883118 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.883155 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.883167 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.883184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.883197 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:02Z","lastTransitionTime":"2026-02-21T21:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.889680 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.903163 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.913817 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:02Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.940646 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 12:42:44.629966537 +0000 UTC Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.976001 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:02 crc kubenswrapper[4717]: E0221 21:47:02.976107 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.976395 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:02 crc kubenswrapper[4717]: E0221 21:47:02.976451 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.976491 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:02 crc kubenswrapper[4717]: E0221 21:47:02.976531 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.986033 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.986061 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.986072 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.986085 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:02 crc kubenswrapper[4717]: I0221 21:47:02.986096 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:02Z","lastTransitionTime":"2026-02-21T21:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.090325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.090356 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.090369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.090388 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.090398 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:03Z","lastTransitionTime":"2026-02-21T21:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.194057 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.194130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.194150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.194180 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.194198 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:03Z","lastTransitionTime":"2026-02-21T21:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.298081 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.298148 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.298167 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.298194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.298213 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:03Z","lastTransitionTime":"2026-02-21T21:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.322326 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs\") pod \"network-metrics-daemon-gt2bg\" (UID: \"8203b79d-1367-43b6-8567-797ec1b0c09b\") " pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:03 crc kubenswrapper[4717]: E0221 21:47:03.322616 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 21:47:03 crc kubenswrapper[4717]: E0221 21:47:03.322747 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs podName:8203b79d-1367-43b6-8567-797ec1b0c09b nodeName:}" failed. No retries permitted until 2026-02-21 21:47:04.32271235 +0000 UTC m=+39.104246072 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs") pod "network-metrics-daemon-gt2bg" (UID: "8203b79d-1367-43b6-8567-797ec1b0c09b") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.401842 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.401955 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.401978 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.402005 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.402023 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:03Z","lastTransitionTime":"2026-02-21T21:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.505242 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.505321 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.505339 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.505370 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.505388 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:03Z","lastTransitionTime":"2026-02-21T21:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.608509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.608608 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.608628 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.608654 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.608672 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:03Z","lastTransitionTime":"2026-02-21T21:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.712269 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.712349 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.712368 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.712393 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.712411 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:03Z","lastTransitionTime":"2026-02-21T21:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.816366 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.816428 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.816441 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.816464 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.816480 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:03Z","lastTransitionTime":"2026-02-21T21:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.920488 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.920572 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.920592 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.920619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.920639 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:03Z","lastTransitionTime":"2026-02-21T21:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.941317 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 18:46:53.076768922 +0000 UTC Feb 21 21:47:03 crc kubenswrapper[4717]: I0221 21:47:03.976025 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:03 crc kubenswrapper[4717]: E0221 21:47:03.976223 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.023087 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.023136 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.023147 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.023164 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.023174 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:04Z","lastTransitionTime":"2026-02-21T21:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.125770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.125910 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.125937 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.125970 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.125994 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:04Z","lastTransitionTime":"2026-02-21T21:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.229997 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.230061 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.230079 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.230106 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.230123 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:04Z","lastTransitionTime":"2026-02-21T21:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.333486 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.333550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.333567 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.333591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.333609 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:04Z","lastTransitionTime":"2026-02-21T21:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.334541 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs\") pod \"network-metrics-daemon-gt2bg\" (UID: \"8203b79d-1367-43b6-8567-797ec1b0c09b\") " pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:04 crc kubenswrapper[4717]: E0221 21:47:04.334753 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 21:47:04 crc kubenswrapper[4717]: E0221 21:47:04.334843 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs podName:8203b79d-1367-43b6-8567-797ec1b0c09b nodeName:}" failed. No retries permitted until 2026-02-21 21:47:06.334819865 +0000 UTC m=+41.116353517 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs") pod "network-metrics-daemon-gt2bg" (UID: "8203b79d-1367-43b6-8567-797ec1b0c09b") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.444322 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.444396 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.444416 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.444444 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.444462 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:04Z","lastTransitionTime":"2026-02-21T21:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.548390 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.548452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.548468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.548491 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.548508 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:04Z","lastTransitionTime":"2026-02-21T21:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.652601 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.652688 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.652709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.652744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.652768 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:04Z","lastTransitionTime":"2026-02-21T21:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.756486 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.756563 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.756582 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.756609 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.756629 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:04Z","lastTransitionTime":"2026-02-21T21:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.859583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.859666 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.859684 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.859710 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.859728 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:04Z","lastTransitionTime":"2026-02-21T21:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.942143 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 00:45:47.663467777 +0000 UTC Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.963082 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.963149 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.963172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.963198 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.963216 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:04Z","lastTransitionTime":"2026-02-21T21:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.975438 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.975602 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:04 crc kubenswrapper[4717]: I0221 21:47:04.975595 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:04 crc kubenswrapper[4717]: E0221 21:47:04.975818 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:04 crc kubenswrapper[4717]: E0221 21:47:04.976001 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:04 crc kubenswrapper[4717]: E0221 21:47:04.976104 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.067080 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.067160 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.067184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.067216 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.067239 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:05Z","lastTransitionTime":"2026-02-21T21:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.170027 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.170102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.170127 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.170157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.170177 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:05Z","lastTransitionTime":"2026-02-21T21:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.273405 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.273470 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.273489 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.273515 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.273537 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:05Z","lastTransitionTime":"2026-02-21T21:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.377240 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.377307 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.377319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.377341 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.377359 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:05Z","lastTransitionTime":"2026-02-21T21:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.480914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.480964 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.480981 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.481004 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.481022 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:05Z","lastTransitionTime":"2026-02-21T21:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.584601 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.584666 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.584679 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.584709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.584725 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:05Z","lastTransitionTime":"2026-02-21T21:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.687468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.687510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.687519 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.687534 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.687545 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:05Z","lastTransitionTime":"2026-02-21T21:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.791655 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.791729 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.791752 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.791779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.791805 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:05Z","lastTransitionTime":"2026-02-21T21:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.895023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.895128 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.895148 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.895172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.895189 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:05Z","lastTransitionTime":"2026-02-21T21:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.942839 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 10:44:37.735798314 +0000 UTC Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.976172 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:05 crc kubenswrapper[4717]: E0221 21:47:05.976413 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.998249 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.998320 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.998337 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.998364 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.998381 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:05Z","lastTransitionTime":"2026-02-21T21:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:05 crc kubenswrapper[4717]: I0221 21:47:05.998426 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:05Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.021190 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.042304 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.066275 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.092538 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.107454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.107513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.107687 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.107854 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.107931 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:06Z","lastTransitionTime":"2026-02-21T21:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.113942 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.135947 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.154198 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.173232 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.191289 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.211130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.211219 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.211233 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.211255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.211272 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:06Z","lastTransitionTime":"2026-02-21T21:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.229941 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:46:59Z\\\",\\\"message\\\":\\\"e-identity/network-node-identity-vrzqb\\\\nI0221 21:46:59.315913 6161 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-plane-machine-set-operator]} 
name:Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0221 21:46:59.315959 6161 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0221 21:46:59.315959 6161 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node networ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.251022 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.267575 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc 
kubenswrapper[4717]: I0221 21:47:06.287585 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.303705 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.314037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.314079 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.314101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:06 crc 
kubenswrapper[4717]: I0221 21:47:06.314122 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.314137 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:06Z","lastTransitionTime":"2026-02-21T21:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.322652 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21
T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.359677 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs\") pod \"network-metrics-daemon-gt2bg\" (UID: \"8203b79d-1367-43b6-8567-797ec1b0c09b\") " pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:06 crc kubenswrapper[4717]: E0221 21:47:06.359982 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 21:47:06 crc kubenswrapper[4717]: E0221 21:47:06.360126 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs podName:8203b79d-1367-43b6-8567-797ec1b0c09b nodeName:}" failed. No retries permitted until 2026-02-21 21:47:10.360090182 +0000 UTC m=+45.141623844 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs") pod "network-metrics-daemon-gt2bg" (UID: "8203b79d-1367-43b6-8567-797ec1b0c09b") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.417095 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.417161 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.417181 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.417207 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.417228 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:06Z","lastTransitionTime":"2026-02-21T21:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.520933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.521014 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.521034 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.521059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.521082 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:06Z","lastTransitionTime":"2026-02-21T21:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.625241 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.625300 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.625319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.625347 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.625364 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:06Z","lastTransitionTime":"2026-02-21T21:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.728756 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.728830 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.728849 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.728901 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.728922 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:06Z","lastTransitionTime":"2026-02-21T21:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.832324 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.832397 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.832423 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.832459 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.832478 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:06Z","lastTransitionTime":"2026-02-21T21:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.935976 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.936047 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.936066 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.936091 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.936108 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:06Z","lastTransitionTime":"2026-02-21T21:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.943905 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 19:45:15.906985974 +0000 UTC Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.975908 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.975946 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:06 crc kubenswrapper[4717]: E0221 21:47:06.976149 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:06 crc kubenswrapper[4717]: E0221 21:47:06.976299 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:06 crc kubenswrapper[4717]: I0221 21:47:06.976507 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:06 crc kubenswrapper[4717]: E0221 21:47:06.976802 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.038939 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.039007 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.039024 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.039047 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.039066 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:07Z","lastTransitionTime":"2026-02-21T21:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.142327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.142389 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.142413 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.142445 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.142468 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:07Z","lastTransitionTime":"2026-02-21T21:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.245349 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.245761 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.246008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.246190 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.246338 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:07Z","lastTransitionTime":"2026-02-21T21:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.349559 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.349936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.349948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.349967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.349981 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:07Z","lastTransitionTime":"2026-02-21T21:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.452553 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.452598 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.452612 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.452632 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.452646 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:07Z","lastTransitionTime":"2026-02-21T21:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.555682 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.556212 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.556346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.556480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.556619 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:07Z","lastTransitionTime":"2026-02-21T21:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.659239 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.659290 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.659305 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.659328 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.659341 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:07Z","lastTransitionTime":"2026-02-21T21:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.771361 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.771421 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.771436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.771457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.771472 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:07Z","lastTransitionTime":"2026-02-21T21:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.873976 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.874017 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.874033 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.874054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.874071 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:07Z","lastTransitionTime":"2026-02-21T21:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.944762 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 08:42:37.147014034 +0000 UTC Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.975485 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:07 crc kubenswrapper[4717]: E0221 21:47:07.975641 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.979342 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.979391 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.979408 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.979428 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:07 crc kubenswrapper[4717]: I0221 21:47:07.979445 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:07Z","lastTransitionTime":"2026-02-21T21:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.082351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.082772 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.082905 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.083028 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.083115 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:08Z","lastTransitionTime":"2026-02-21T21:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.187139 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.187187 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.187199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.187216 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.187232 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:08Z","lastTransitionTime":"2026-02-21T21:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.291327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.291415 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.291442 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.291507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.291537 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:08Z","lastTransitionTime":"2026-02-21T21:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.394282 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.394354 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.394371 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.394404 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.394421 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:08Z","lastTransitionTime":"2026-02-21T21:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.497237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.497303 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.497324 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.497351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.497370 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:08Z","lastTransitionTime":"2026-02-21T21:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.599730 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.599799 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.599821 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.599850 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.599905 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:08Z","lastTransitionTime":"2026-02-21T21:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.703411 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.703493 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.703513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.703539 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.703562 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:08Z","lastTransitionTime":"2026-02-21T21:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.807058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.807494 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.807650 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.807802 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.807996 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:08Z","lastTransitionTime":"2026-02-21T21:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.911788 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.911895 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.911916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.911940 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.911959 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:08Z","lastTransitionTime":"2026-02-21T21:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.945552 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 06:11:31.846140976 +0000 UTC Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.976018 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.976019 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:08 crc kubenswrapper[4717]: I0221 21:47:08.976170 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:08 crc kubenswrapper[4717]: E0221 21:47:08.976376 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:08 crc kubenswrapper[4717]: E0221 21:47:08.976534 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:08 crc kubenswrapper[4717]: E0221 21:47:08.976798 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.018276 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.018340 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.018359 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.018385 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.018405 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:09Z","lastTransitionTime":"2026-02-21T21:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.121977 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.122041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.122060 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.122095 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.122114 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:09Z","lastTransitionTime":"2026-02-21T21:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.225654 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.225728 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.225749 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.225781 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.225799 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:09Z","lastTransitionTime":"2026-02-21T21:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.329306 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.329375 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.329394 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.329420 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.329435 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:09Z","lastTransitionTime":"2026-02-21T21:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.433762 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.433826 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.433845 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.433889 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.433904 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:09Z","lastTransitionTime":"2026-02-21T21:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.536210 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.536264 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.536275 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.536292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.536306 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:09Z","lastTransitionTime":"2026-02-21T21:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.639930 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.640006 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.640032 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.640071 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.640099 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:09Z","lastTransitionTime":"2026-02-21T21:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.743520 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.743591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.743611 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.743640 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.743658 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:09Z","lastTransitionTime":"2026-02-21T21:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.848146 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.848226 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.848246 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.848276 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.848297 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:09Z","lastTransitionTime":"2026-02-21T21:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.946563 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 14:34:03.789381407 +0000 UTC Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.951352 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.951413 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.951428 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.951448 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.951461 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:09Z","lastTransitionTime":"2026-02-21T21:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:09 crc kubenswrapper[4717]: I0221 21:47:09.975991 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:09 crc kubenswrapper[4717]: E0221 21:47:09.976255 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.055032 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.055102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.055118 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.055143 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.055162 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:10Z","lastTransitionTime":"2026-02-21T21:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.158252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.158307 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.158319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.158338 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.158354 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:10Z","lastTransitionTime":"2026-02-21T21:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.262212 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.262318 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.262340 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.262372 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.262396 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:10Z","lastTransitionTime":"2026-02-21T21:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.365919 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.365980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.366003 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.366033 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.366055 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:10Z","lastTransitionTime":"2026-02-21T21:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.393933 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.395310 4717 scope.go:117] "RemoveContainer" containerID="586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.406781 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs\") pod \"network-metrics-daemon-gt2bg\" (UID: \"8203b79d-1367-43b6-8567-797ec1b0c09b\") " pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:10 crc kubenswrapper[4717]: E0221 21:47:10.406996 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 21:47:10 crc kubenswrapper[4717]: E0221 21:47:10.407079 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs podName:8203b79d-1367-43b6-8567-797ec1b0c09b nodeName:}" failed. No retries permitted until 2026-02-21 21:47:18.407057574 +0000 UTC m=+53.188591236 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs") pod "network-metrics-daemon-gt2bg" (UID: "8203b79d-1367-43b6-8567-797ec1b0c09b") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.470292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.470366 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.470387 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.470418 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.470441 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:10Z","lastTransitionTime":"2026-02-21T21:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.573594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.573656 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.573674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.573696 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.573710 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:10Z","lastTransitionTime":"2026-02-21T21:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.676981 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.677054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.677075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.677104 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.677131 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:10Z","lastTransitionTime":"2026-02-21T21:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.779838 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.779928 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.779945 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.779967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.779984 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:10Z","lastTransitionTime":"2026-02-21T21:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.882823 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.882923 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.882983 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.883015 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.883035 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:10Z","lastTransitionTime":"2026-02-21T21:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.947114 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 11:48:27.873588617 +0000 UTC Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.975468 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.975661 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:10 crc kubenswrapper[4717]: E0221 21:47:10.975781 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.975876 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:10 crc kubenswrapper[4717]: E0221 21:47:10.975991 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:10 crc kubenswrapper[4717]: E0221 21:47:10.976106 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.985555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.985607 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.985627 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.985665 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:10 crc kubenswrapper[4717]: I0221 21:47:10.985707 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:10Z","lastTransitionTime":"2026-02-21T21:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.088363 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.088412 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.088427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.088451 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.088467 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:11Z","lastTransitionTime":"2026-02-21T21:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.191461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.191513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.191525 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.191545 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.191558 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:11Z","lastTransitionTime":"2026-02-21T21:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.295375 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.295440 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.295454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.295475 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.295487 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:11Z","lastTransitionTime":"2026-02-21T21:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.338040 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovnkube-controller/1.log" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.341521 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerStarted","Data":"651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d"} Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.341912 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.367913 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.394917 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.398555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.398639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.398662 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.398695 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.398720 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:11Z","lastTransitionTime":"2026-02-21T21:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.414455 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.435193 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.470609 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:46:59Z\\\",\\\"message\\\":\\\"e-identity/network-node-identity-vrzqb\\\\nI0221 21:46:59.315913 6161 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-plane-machine-set-operator]} 
name:Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0221 21:46:59.315959 6161 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0221 21:46:59.315959 6161 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node 
networ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.488699 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.501293 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.501340 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.501351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.501367 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.501379 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:11Z","lastTransitionTime":"2026-02-21T21:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.503263 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc 
kubenswrapper[4717]: I0221 21:47:11.520394 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.535510 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.548750 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926
950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.566531 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a046
5d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.589909 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.604139 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.604177 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.604186 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.604203 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.604214 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:11Z","lastTransitionTime":"2026-02-21T21:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.606152 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.622354 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.638016 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.655645 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:11Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.707198 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.707281 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.707303 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.707335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.707358 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:11Z","lastTransitionTime":"2026-02-21T21:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.811543 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.811631 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.811659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.811685 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.811703 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:11Z","lastTransitionTime":"2026-02-21T21:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.914659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.914721 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.914733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.914751 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.914761 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:11Z","lastTransitionTime":"2026-02-21T21:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.948092 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 09:25:32.159734025 +0000 UTC Feb 21 21:47:11 crc kubenswrapper[4717]: I0221 21:47:11.975470 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:11 crc kubenswrapper[4717]: E0221 21:47:11.975625 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.017850 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.017911 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.017922 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.017937 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.017947 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:12Z","lastTransitionTime":"2026-02-21T21:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.121263 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.121327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.121346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.121372 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.121390 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:12Z","lastTransitionTime":"2026-02-21T21:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.224594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.224697 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.224723 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.224756 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.224782 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:12Z","lastTransitionTime":"2026-02-21T21:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.328423 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.328503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.328524 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.328550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.328569 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:12Z","lastTransitionTime":"2026-02-21T21:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.347916 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovnkube-controller/2.log" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.349035 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovnkube-controller/1.log" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.353999 4717 generic.go:334] "Generic (PLEG): container finished" podID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerID="651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d" exitCode=1 Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.354075 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerDied","Data":"651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d"} Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.354152 4717 scope.go:117] "RemoveContainer" containerID="586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.355292 4717 scope.go:117] "RemoveContainer" containerID="651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d" Feb 21 21:47:12 crc kubenswrapper[4717]: E0221 21:47:12.355614 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.378963 4717 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.400443 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.417413 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc 
kubenswrapper[4717]: I0221 21:47:12.431919 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.432035 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.432056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.432082 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.432102 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:12Z","lastTransitionTime":"2026-02-21T21:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.441342 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.459508 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.472903 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.496076 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.523651 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://586745e033a202641042d8e1e423983ec31e98dccc87ae7f2fe426daab001768\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:46:59Z\\\",\\\"message\\\":\\\"e-identity/network-node-identity-vrzqb\\\\nI0221 21:46:59.315913 6161 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-plane-machine-set-operator]} 
name:Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0221 21:46:59.315959 6161 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0221 21:46:59.315959 6161 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node networ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:11Z\\\",\\\"message\\\":\\\"es.lbConfig(nil)\\\\nI0221 21:47:11.426421 6368 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0221 21:47:11.426466 6368 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0221 21:47:11.426475 6368 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:47:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.537268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.537359 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.537378 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.541022 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.541098 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:12Z","lastTransitionTime":"2026-02-21T21:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.541745 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.558305 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.575897 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.591025 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926
950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.606510 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a046
5d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.619126 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.636019 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.640096 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.640136 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.640153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.640177 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.640194 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:12Z","lastTransitionTime":"2026-02-21T21:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:12 crc kubenswrapper[4717]: E0221 21:47:12.653443 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.657110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.657164 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.657185 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.657208 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.657224 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:12Z","lastTransitionTime":"2026-02-21T21:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.659680 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: E0221 21:47:12.677026 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.681357 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.681434 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.681460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.681490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.681508 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:12Z","lastTransitionTime":"2026-02-21T21:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:12 crc kubenswrapper[4717]: E0221 21:47:12.701137 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.706212 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.706276 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.706297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.706325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.706342 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:12Z","lastTransitionTime":"2026-02-21T21:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:12 crc kubenswrapper[4717]: E0221 21:47:12.725558 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.729999 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.730054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.730072 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.730096 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.730114 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:12Z","lastTransitionTime":"2026-02-21T21:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:12 crc kubenswrapper[4717]: E0221 21:47:12.748845 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:12Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:12 crc kubenswrapper[4717]: E0221 21:47:12.749155 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.751236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.751282 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.751296 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.751316 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.751333 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:12Z","lastTransitionTime":"2026-02-21T21:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.854622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.854671 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.854684 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.854709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.854728 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:12Z","lastTransitionTime":"2026-02-21T21:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.948409 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 17:08:48.671410278 +0000 UTC Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.957845 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.957934 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.957963 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.957996 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.958021 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:12Z","lastTransitionTime":"2026-02-21T21:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.976402 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.976433 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:12 crc kubenswrapper[4717]: E0221 21:47:12.976614 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:12 crc kubenswrapper[4717]: I0221 21:47:12.976644 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:12 crc kubenswrapper[4717]: E0221 21:47:12.976785 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:12 crc kubenswrapper[4717]: E0221 21:47:12.976940 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.062155 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.062325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.062364 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.062409 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.062434 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:13Z","lastTransitionTime":"2026-02-21T21:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.166325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.166501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.166530 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.166558 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.166577 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:13Z","lastTransitionTime":"2026-02-21T21:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.270577 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.270657 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.270676 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.270702 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.270720 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:13Z","lastTransitionTime":"2026-02-21T21:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.362262 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovnkube-controller/2.log" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.369378 4717 scope.go:117] "RemoveContainer" containerID="651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d" Feb 21 21:47:13 crc kubenswrapper[4717]: E0221 21:47:13.369814 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.374686 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.375173 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.375210 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.375247 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.375276 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:13Z","lastTransitionTime":"2026-02-21T21:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.393298 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db9
7516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.411398 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.433747 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.455083 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825
771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 
only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.476592 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.478443 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.478488 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.478501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.478523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.478540 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:13Z","lastTransitionTime":"2026-02-21T21:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.496066 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.510451 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.527540 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.552601 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:11Z\\\",\\\"message\\\":\\\"es.lbConfig(nil)\\\\nI0221 21:47:11.426421 6368 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0221 21:47:11.426466 6368 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0221 21:47:11.426475 6368 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:47:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.571192 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.601679 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.601739 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.601753 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.601774 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.601786 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:13Z","lastTransitionTime":"2026-02-21T21:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.613162 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc 
kubenswrapper[4717]: I0221 21:47:13.652585 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.666419 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.677054 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926
950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.690074 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.704479 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.704526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.704538 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.704557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.704569 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:13Z","lastTransitionTime":"2026-02-21T21:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.705254 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:13Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.807547 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 
21:47:13.807619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.807638 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.807669 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.807694 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:13Z","lastTransitionTime":"2026-02-21T21:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.911477 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.911560 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.911580 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.911613 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.911633 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:13Z","lastTransitionTime":"2026-02-21T21:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.949528 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 07:12:14.108243894 +0000 UTC Feb 21 21:47:13 crc kubenswrapper[4717]: I0221 21:47:13.975443 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:13 crc kubenswrapper[4717]: E0221 21:47:13.975716 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.014147 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.014209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.014223 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.014244 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.014258 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:14Z","lastTransitionTime":"2026-02-21T21:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.118410 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.118505 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.118526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.118562 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.118585 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:14Z","lastTransitionTime":"2026-02-21T21:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.221656 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.221726 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.221746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.221793 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.221814 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:14Z","lastTransitionTime":"2026-02-21T21:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.324492 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.324568 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.324588 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.324617 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.324635 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:14Z","lastTransitionTime":"2026-02-21T21:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.428460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.428559 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.428579 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.428611 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.428633 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:14Z","lastTransitionTime":"2026-02-21T21:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.532184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.532252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.532274 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.532310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.532333 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:14Z","lastTransitionTime":"2026-02-21T21:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.635943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.636030 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.636053 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.636448 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.636672 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:14Z","lastTransitionTime":"2026-02-21T21:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.740102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.740168 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.740187 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.740213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.740231 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:14Z","lastTransitionTime":"2026-02-21T21:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.867910 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.867975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.867995 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.868019 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.868037 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:14Z","lastTransitionTime":"2026-02-21T21:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.950556 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 12:49:36.105299792 +0000 UTC Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.973681 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.973756 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.973776 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.973805 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.973835 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:14Z","lastTransitionTime":"2026-02-21T21:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.975323 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:14 crc kubenswrapper[4717]: E0221 21:47:14.975519 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.975652 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:14 crc kubenswrapper[4717]: I0221 21:47:14.975803 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:14 crc kubenswrapper[4717]: E0221 21:47:14.976058 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:14 crc kubenswrapper[4717]: E0221 21:47:14.976538 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.078186 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.078261 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.078280 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.078310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.078333 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:15Z","lastTransitionTime":"2026-02-21T21:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.181786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.181849 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.181893 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.181915 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.181928 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:15Z","lastTransitionTime":"2026-02-21T21:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.284998 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.285050 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.285065 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.285093 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.285108 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:15Z","lastTransitionTime":"2026-02-21T21:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.390089 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.390194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.390215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.390320 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.390342 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:15Z","lastTransitionTime":"2026-02-21T21:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.494555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.494614 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.494634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.494659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.494677 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:15Z","lastTransitionTime":"2026-02-21T21:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.599057 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.599128 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.599155 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.599182 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.599200 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:15Z","lastTransitionTime":"2026-02-21T21:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.704191 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.704266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.704315 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.704342 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.704358 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:15Z","lastTransitionTime":"2026-02-21T21:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.808629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.808688 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.808708 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.808735 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.808754 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:15Z","lastTransitionTime":"2026-02-21T21:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.911913 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.911980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.911999 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.912023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.912040 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:15Z","lastTransitionTime":"2026-02-21T21:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.950841 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 02:25:14.918894212 +0000 UTC Feb 21 21:47:15 crc kubenswrapper[4717]: I0221 21:47:15.976408 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:15 crc kubenswrapper[4717]: E0221 21:47:15.976668 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.003078 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\
\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:15Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.015307 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.015395 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.015416 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.015446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.015466 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:16Z","lastTransitionTime":"2026-02-21T21:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.028500 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.048716 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.076247 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.110471 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.118445 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.118507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.118528 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.118554 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.118574 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:16Z","lastTransitionTime":"2026-02-21T21:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.133409 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.134013 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.150280 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.152521 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.167683 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc 
kubenswrapper[4717]: I0221 21:47:16.189741 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.204882 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.218361 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.223315 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.223369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.223381 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.223398 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.223409 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:16Z","lastTransitionTime":"2026-02-21T21:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.237793 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z 
is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.279236 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:11Z\\\",\\\"message\\\":\\\"es.lbConfig(nil)\\\\nI0221 21:47:11.426421 6368 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0221 21:47:11.426466 6368 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0221 21:47:11.426475 6368 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:47:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.297229 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.315016 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.328027 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.328124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.328153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:16 crc 
kubenswrapper[4717]: I0221 21:47:16.328236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.328310 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:16Z","lastTransitionTime":"2026-02-21T21:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.341235 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21
T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.357991 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.370066 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.381879 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926
950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.396278 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.410985 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.423950 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825
771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 
only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.430553 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.430589 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 
21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.430602 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.430617 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.430627 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:16Z","lastTransitionTime":"2026-02-21T21:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.438517 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.454943 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.471146 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"154bad15-f560-48f9-ae8e-92c12d3ae5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bfed6617bca25f3b3212d9fb2d1ae31778a863f9fa9b1fe3e9eb787dea44f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c2fa325d2da94225841ab78a9e160e52bace1f0ec2d121b93db85323b55785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04a04b3b7590932005e60eaf68fa97fdeb071b48df00bcdbb0c65deb8fe9da9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.487580 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.505950 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.531252 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:11Z\\\",\\\"message\\\":\\\"es.lbConfig(nil)\\\\nI0221 21:47:11.426421 6368 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0221 21:47:11.426466 6368 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0221 21:47:11.426475 6368 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:47:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.533443 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.533481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.533495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.533517 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.533530 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:16Z","lastTransitionTime":"2026-02-21T21:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.547439 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.562380 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc 
kubenswrapper[4717]: I0221 21:47:16.576228 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.596408 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.612576 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:16Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.637946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.638198 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.638324 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.638436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.638535 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:16Z","lastTransitionTime":"2026-02-21T21:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.741978 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.742381 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.742498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.742768 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.742928 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:16Z","lastTransitionTime":"2026-02-21T21:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.846697 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.846780 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.846805 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.846836 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.846890 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:16Z","lastTransitionTime":"2026-02-21T21:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.951513 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 10:10:17.785590862 +0000 UTC Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.952911 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.953115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.953264 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.953442 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.953630 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:16Z","lastTransitionTime":"2026-02-21T21:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.975342 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.975391 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:16 crc kubenswrapper[4717]: I0221 21:47:16.975638 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:16 crc kubenswrapper[4717]: E0221 21:47:16.975934 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:16 crc kubenswrapper[4717]: E0221 21:47:16.976210 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:16 crc kubenswrapper[4717]: E0221 21:47:16.976399 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.059855 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.059987 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.060016 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.060054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.060083 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:17Z","lastTransitionTime":"2026-02-21T21:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.164073 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.164147 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.164168 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.164194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.164212 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:17Z","lastTransitionTime":"2026-02-21T21:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.268278 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.268330 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.268351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.268378 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.268399 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:17Z","lastTransitionTime":"2026-02-21T21:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.408048 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.408106 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.408119 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.408143 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.408157 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:17Z","lastTransitionTime":"2026-02-21T21:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.502034 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:17 crc kubenswrapper[4717]: E0221 21:47:17.502280 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 21:47:17 crc kubenswrapper[4717]: E0221 21:47:17.502454 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:47:49.502415867 +0000 UTC m=+84.283949589 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.512218 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.512271 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.512287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.512308 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.512322 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:17Z","lastTransitionTime":"2026-02-21T21:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.603763 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.603972 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:17 crc kubenswrapper[4717]: E0221 21:47:17.604062 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 21:47:17 crc kubenswrapper[4717]: E0221 21:47:17.604221 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:47:49.60418727 +0000 UTC m=+84.385720932 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 21 21:47:17 crc kubenswrapper[4717]: E0221 21:47:17.604270 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 21:47:17 crc kubenswrapper[4717]: E0221 21:47:17.604319 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 21:47:17 crc kubenswrapper[4717]: E0221 21:47:17.604342 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:47:17 crc kubenswrapper[4717]: E0221 21:47:17.604457 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-21 21:47:49.604415996 +0000 UTC m=+84.385949818 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.616187 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.616259 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.616287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.616327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.616355 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:17Z","lastTransitionTime":"2026-02-21T21:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.705193 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.705369 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:17 crc kubenswrapper[4717]: E0221 21:47:17.705427 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:47:49.70537926 +0000 UTC m=+84.486912912 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:47:17 crc kubenswrapper[4717]: E0221 21:47:17.705560 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 21:47:17 crc kubenswrapper[4717]: E0221 21:47:17.705589 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 21:47:17 crc kubenswrapper[4717]: E0221 21:47:17.705609 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:47:17 crc kubenswrapper[4717]: E0221 21:47:17.705696 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-21 21:47:49.705670637 +0000 UTC m=+84.487204289 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.720492 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.720552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.720572 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.720602 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.720622 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:17Z","lastTransitionTime":"2026-02-21T21:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.823783 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.823844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.823898 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.823926 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.823945 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:17Z","lastTransitionTime":"2026-02-21T21:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.927263 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.927346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.927365 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.927399 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.927418 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:17Z","lastTransitionTime":"2026-02-21T21:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.952745 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 10:58:20.039760411 +0000 UTC Feb 21 21:47:17 crc kubenswrapper[4717]: I0221 21:47:17.975497 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:17 crc kubenswrapper[4717]: E0221 21:47:17.975755 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.030664 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.030736 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.030756 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.030787 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.030807 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:18Z","lastTransitionTime":"2026-02-21T21:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.135008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.135062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.135079 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.135109 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.135127 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:18Z","lastTransitionTime":"2026-02-21T21:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.238987 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.239046 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.239064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.239091 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.239109 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:18Z","lastTransitionTime":"2026-02-21T21:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.343189 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.343282 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.343303 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.343329 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.343350 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:18Z","lastTransitionTime":"2026-02-21T21:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.416069 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs\") pod \"network-metrics-daemon-gt2bg\" (UID: \"8203b79d-1367-43b6-8567-797ec1b0c09b\") " pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:18 crc kubenswrapper[4717]: E0221 21:47:18.416298 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 21:47:18 crc kubenswrapper[4717]: E0221 21:47:18.416443 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs podName:8203b79d-1367-43b6-8567-797ec1b0c09b nodeName:}" failed. No retries permitted until 2026-02-21 21:47:34.41641319 +0000 UTC m=+69.197946852 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs") pod "network-metrics-daemon-gt2bg" (UID: "8203b79d-1367-43b6-8567-797ec1b0c09b") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.446516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.446604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.446629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.446662 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.446688 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:18Z","lastTransitionTime":"2026-02-21T21:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.549840 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.549945 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.549970 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.550000 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.550028 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:18Z","lastTransitionTime":"2026-02-21T21:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.652849 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.652959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.652978 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.653004 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.653022 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:18Z","lastTransitionTime":"2026-02-21T21:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.756692 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.756757 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.756775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.756802 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.756823 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:18Z","lastTransitionTime":"2026-02-21T21:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.861037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.861139 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.861153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.861176 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.861194 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:18Z","lastTransitionTime":"2026-02-21T21:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.954015 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 16:52:09.712507684 +0000 UTC Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.964623 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.964703 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.964723 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.964750 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.964768 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:18Z","lastTransitionTime":"2026-02-21T21:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.976209 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.976236 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:18 crc kubenswrapper[4717]: I0221 21:47:18.976276 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:18 crc kubenswrapper[4717]: E0221 21:47:18.976399 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:18 crc kubenswrapper[4717]: E0221 21:47:18.976546 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:18 crc kubenswrapper[4717]: E0221 21:47:18.976806 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.068765 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.068835 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.068854 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.068914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.068938 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:19Z","lastTransitionTime":"2026-02-21T21:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.174955 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.175350 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.175503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.175650 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.175922 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:19Z","lastTransitionTime":"2026-02-21T21:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.279044 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.279380 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.279467 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.279563 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.279664 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:19Z","lastTransitionTime":"2026-02-21T21:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.384237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.384324 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.384350 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.384375 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.384396 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:19Z","lastTransitionTime":"2026-02-21T21:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.493029 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.493663 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.493682 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.493712 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.493733 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:19Z","lastTransitionTime":"2026-02-21T21:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.597467 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.597510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.597529 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.597553 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.597572 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:19Z","lastTransitionTime":"2026-02-21T21:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.700410 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.700579 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.700605 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.700629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.700645 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:19Z","lastTransitionTime":"2026-02-21T21:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.803767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.803827 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.803850 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.803917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.803942 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:19Z","lastTransitionTime":"2026-02-21T21:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.906913 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.906983 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.907008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.907038 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.907065 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:19Z","lastTransitionTime":"2026-02-21T21:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.955044 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 01:46:25.160322168 +0000 UTC Feb 21 21:47:19 crc kubenswrapper[4717]: I0221 21:47:19.975619 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:19 crc kubenswrapper[4717]: E0221 21:47:19.975933 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.010099 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.010200 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.010218 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.010243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.010305 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:20Z","lastTransitionTime":"2026-02-21T21:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.113850 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.113896 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.113904 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.113919 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.113928 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:20Z","lastTransitionTime":"2026-02-21T21:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.217255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.217297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.217315 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.217335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.217351 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:20Z","lastTransitionTime":"2026-02-21T21:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.320305 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.320398 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.320421 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.320453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.320477 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:20Z","lastTransitionTime":"2026-02-21T21:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.423489 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.423553 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.423571 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.423597 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.423619 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:20Z","lastTransitionTime":"2026-02-21T21:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.527801 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.527932 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.527953 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.527985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.528004 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:20Z","lastTransitionTime":"2026-02-21T21:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.631352 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.631434 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.631461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.631503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.631530 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:20Z","lastTransitionTime":"2026-02-21T21:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.735083 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.735178 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.735213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.735253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.735275 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:20Z","lastTransitionTime":"2026-02-21T21:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.838496 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.838567 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.838587 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.838615 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.838635 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:20Z","lastTransitionTime":"2026-02-21T21:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.942183 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.942240 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.942263 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.942299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.942319 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:20Z","lastTransitionTime":"2026-02-21T21:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.955962 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 13:34:56.833139922 +0000 UTC Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.976317 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.976428 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:20 crc kubenswrapper[4717]: I0221 21:47:20.976459 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:20 crc kubenswrapper[4717]: E0221 21:47:20.976545 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:20 crc kubenswrapper[4717]: E0221 21:47:20.976689 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:20 crc kubenswrapper[4717]: E0221 21:47:20.976962 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.045370 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.045445 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.045470 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.045499 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.045519 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:21Z","lastTransitionTime":"2026-02-21T21:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.148780 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.148854 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.148910 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.148936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.148964 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:21Z","lastTransitionTime":"2026-02-21T21:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.251825 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.251940 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.251960 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.251989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.252008 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:21Z","lastTransitionTime":"2026-02-21T21:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.355832 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.355986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.356012 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.356047 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.356070 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:21Z","lastTransitionTime":"2026-02-21T21:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.459563 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.459631 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.459649 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.459674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.459692 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:21Z","lastTransitionTime":"2026-02-21T21:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.562926 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.562988 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.563006 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.563032 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.563052 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:21Z","lastTransitionTime":"2026-02-21T21:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.666845 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.666959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.666979 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.667005 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.667023 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:21Z","lastTransitionTime":"2026-02-21T21:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.770571 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.770646 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.770665 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.770691 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.770710 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:21Z","lastTransitionTime":"2026-02-21T21:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.874130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.874196 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.874215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.874238 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.874252 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:21Z","lastTransitionTime":"2026-02-21T21:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.956803 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 04:51:36.559901565 +0000 UTC Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.975801 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:21 crc kubenswrapper[4717]: E0221 21:47:21.976039 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.978071 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.978150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.978176 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.978209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:21 crc kubenswrapper[4717]: I0221 21:47:21.978233 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:21Z","lastTransitionTime":"2026-02-21T21:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.081394 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.081449 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.081462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.081486 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.081501 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:22Z","lastTransitionTime":"2026-02-21T21:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.185480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.185563 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.185586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.185615 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.185636 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:22Z","lastTransitionTime":"2026-02-21T21:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.289884 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.289936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.289950 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.289972 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.289989 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:22Z","lastTransitionTime":"2026-02-21T21:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.394088 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.394159 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.394177 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.394206 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.394224 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:22Z","lastTransitionTime":"2026-02-21T21:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.497553 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.497635 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.497661 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.497702 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.497728 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:22Z","lastTransitionTime":"2026-02-21T21:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.601520 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.601597 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.601617 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.601649 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.601672 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:22Z","lastTransitionTime":"2026-02-21T21:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.705180 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.705235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.705252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.705280 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.705299 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:22Z","lastTransitionTime":"2026-02-21T21:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.808636 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.808716 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.808737 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.808767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.808787 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:22Z","lastTransitionTime":"2026-02-21T21:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.912158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.912244 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.912263 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.912289 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.912308 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:22Z","lastTransitionTime":"2026-02-21T21:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.957940 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 02:54:17.698735997 +0000 UTC Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.976412 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.976524 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:22 crc kubenswrapper[4717]: I0221 21:47:22.976633 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:22 crc kubenswrapper[4717]: E0221 21:47:22.977006 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:22 crc kubenswrapper[4717]: E0221 21:47:22.977245 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:22 crc kubenswrapper[4717]: E0221 21:47:22.977331 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.015505 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.015576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.015601 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.015629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.015646 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:23Z","lastTransitionTime":"2026-02-21T21:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.049010 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.049082 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.049103 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.049142 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.049171 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:23Z","lastTransitionTime":"2026-02-21T21:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:23 crc kubenswrapper[4717]: E0221 21:47:23.072585 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:23Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.086038 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.086129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.086153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.086204 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.086232 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:23Z","lastTransitionTime":"2026-02-21T21:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:23 crc kubenswrapper[4717]: E0221 21:47:23.112286 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:23Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.117519 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.117768 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.117953 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.118118 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.118265 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:23Z","lastTransitionTime":"2026-02-21T21:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:23 crc kubenswrapper[4717]: E0221 21:47:23.139313 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:23Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.144218 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.144306 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.144330 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.144358 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.144377 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:23Z","lastTransitionTime":"2026-02-21T21:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:23 crc kubenswrapper[4717]: E0221 21:47:23.170500 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:23Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.175805 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.175898 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.175919 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.175944 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.175967 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:23Z","lastTransitionTime":"2026-02-21T21:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:23 crc kubenswrapper[4717]: E0221 21:47:23.197435 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:23Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:23 crc kubenswrapper[4717]: E0221 21:47:23.197660 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.199915 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.200035 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.200070 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.200108 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.200136 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:23Z","lastTransitionTime":"2026-02-21T21:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.303142 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.303333 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.303358 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.303425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.303449 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:23Z","lastTransitionTime":"2026-02-21T21:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.407466 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.407542 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.407564 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.407600 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.407625 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:23Z","lastTransitionTime":"2026-02-21T21:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.510357 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.510416 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.510433 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.510457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.510475 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:23Z","lastTransitionTime":"2026-02-21T21:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.613632 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.613687 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.613705 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.613729 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.613747 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:23Z","lastTransitionTime":"2026-02-21T21:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.717158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.717231 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.717252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.717279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.717300 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:23Z","lastTransitionTime":"2026-02-21T21:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.819980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.820044 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.820062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.820089 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.820108 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:23Z","lastTransitionTime":"2026-02-21T21:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.925263 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.925378 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.925402 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.925438 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.925463 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:23Z","lastTransitionTime":"2026-02-21T21:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.958706 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 03:51:17.514051527 +0000 UTC Feb 21 21:47:23 crc kubenswrapper[4717]: I0221 21:47:23.976384 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:23 crc kubenswrapper[4717]: E0221 21:47:23.976600 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.029698 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.029764 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.029777 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.029794 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.029807 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:24Z","lastTransitionTime":"2026-02-21T21:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.133259 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.133328 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.133346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.133373 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.133391 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:24Z","lastTransitionTime":"2026-02-21T21:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.237048 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.237110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.237126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.237146 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.237159 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:24Z","lastTransitionTime":"2026-02-21T21:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.339632 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.339747 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.339780 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.339819 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.339845 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:24Z","lastTransitionTime":"2026-02-21T21:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.442808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.442851 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.442883 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.442903 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.442916 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:24Z","lastTransitionTime":"2026-02-21T21:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.546313 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.546381 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.546400 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.546430 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.546448 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:24Z","lastTransitionTime":"2026-02-21T21:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.649836 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.649952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.649972 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.650005 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.650031 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:24Z","lastTransitionTime":"2026-02-21T21:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.758785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.758852 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.758901 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.758928 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.758947 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:24Z","lastTransitionTime":"2026-02-21T21:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.862762 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.862847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.862906 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.862940 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.862964 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:24Z","lastTransitionTime":"2026-02-21T21:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.959279 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 02:44:08.652583306 +0000 UTC Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.966821 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.966939 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.966961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.966986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.967005 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:24Z","lastTransitionTime":"2026-02-21T21:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.976361 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.976445 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.976372 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:24 crc kubenswrapper[4717]: E0221 21:47:24.976630 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:24 crc kubenswrapper[4717]: E0221 21:47:24.977019 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:24 crc kubenswrapper[4717]: E0221 21:47:24.977679 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:24 crc kubenswrapper[4717]: I0221 21:47:24.978207 4717 scope.go:117] "RemoveContainer" containerID="651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d" Feb 21 21:47:24 crc kubenswrapper[4717]: E0221 21:47:24.978536 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.069769 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.069828 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.069844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.069892 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.069911 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:25Z","lastTransitionTime":"2026-02-21T21:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.173241 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.173301 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.173319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.173341 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.173358 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:25Z","lastTransitionTime":"2026-02-21T21:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.276677 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.276750 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.276775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.276802 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.276826 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:25Z","lastTransitionTime":"2026-02-21T21:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.380220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.380279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.380299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.380321 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.380338 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:25Z","lastTransitionTime":"2026-02-21T21:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.483460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.483521 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.483545 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.483574 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.483596 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:25Z","lastTransitionTime":"2026-02-21T21:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.586431 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.586513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.586531 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.586555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.586573 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:25Z","lastTransitionTime":"2026-02-21T21:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.691028 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.691121 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.691146 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.691177 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.691201 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:25Z","lastTransitionTime":"2026-02-21T21:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.794884 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.794961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.794978 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.795006 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.795025 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:25Z","lastTransitionTime":"2026-02-21T21:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.897465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.897505 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.897516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.897532 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.897545 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:25Z","lastTransitionTime":"2026-02-21T21:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.960356 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 08:03:48.315383044 +0000 UTC Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.975744 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:25 crc kubenswrapper[4717]: E0221 21:47:25.977112 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:25 crc kubenswrapper[4717]: I0221 21:47:25.997797 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\
\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:25Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.000732 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.000812 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.000887 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.000920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.000940 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:26Z","lastTransitionTime":"2026-02-21T21:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.016192 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:26Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.040707 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:26Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.064084 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:26Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.080492 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"154bad15-f560-48f9-ae8e-92c12d3ae5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bfed6617bca25f3b3212d9fb2d1ae31778a863f9fa9b1fe3e9eb787dea44f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c2fa325d2da94225841ab78a9e160e52bace1f0ec2d121b93db85323b55785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04a04b3b7590932005e60eaf68fa97fdeb071b48df00bcdbb0c65deb8fe9da9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:26Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.099964 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:26Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.104244 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.104280 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.104296 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.104319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.104334 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:26Z","lastTransitionTime":"2026-02-21T21:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.124791 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:26Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.140096 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:26Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc 
kubenswrapper[4717]: I0221 21:47:26.156399 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:26Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.171628 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:26Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.186799 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:26Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.206941 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.207040 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.207057 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.207077 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.207090 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:26Z","lastTransitionTime":"2026-02-21T21:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.210658 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:26Z 
is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.242620 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:11Z\\\",\\\"message\\\":\\\"es.lbConfig(nil)\\\\nI0221 21:47:11.426421 6368 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0221 21:47:11.426466 6368 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0221 21:47:11.426475 6368 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:47:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:26Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.256447 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:26Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.274039 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:26Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.288328 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:26Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.303746 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:26Z is after 2025-08-24T17:21:41Z"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.309904 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.309950 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.309968 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.310086 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.310106 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:26Z","lastTransitionTime":"2026-02-21T21:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.413314 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.413400 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.413423 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.413447 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.413465 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:26Z","lastTransitionTime":"2026-02-21T21:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.516734 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.516832 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.516852 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.516921 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.516944 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:26Z","lastTransitionTime":"2026-02-21T21:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.620736 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.620806 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.620823 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.620851 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.620903 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:26Z","lastTransitionTime":"2026-02-21T21:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.724465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.724557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.724577 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.724607 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.724629 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:26Z","lastTransitionTime":"2026-02-21T21:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.827526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.827593 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.827611 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.827634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.827652 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:26Z","lastTransitionTime":"2026-02-21T21:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.931072 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.931138 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.931158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.931182 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.931200 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:26Z","lastTransitionTime":"2026-02-21T21:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.960825 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 11:25:54.477750348 +0000 UTC
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.975923 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.975968 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 21 21:47:26 crc kubenswrapper[4717]: I0221 21:47:26.975995 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 21 21:47:26 crc kubenswrapper[4717]: E0221 21:47:26.976123 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 21 21:47:26 crc kubenswrapper[4717]: E0221 21:47:26.976240 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 21 21:47:26 crc kubenswrapper[4717]: E0221 21:47:26.976409 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.034761 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.034825 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.034844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.034900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.034920 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:27Z","lastTransitionTime":"2026-02-21T21:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.138557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.138631 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.138650 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.138677 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.138696 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:27Z","lastTransitionTime":"2026-02-21T21:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.243203 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.243301 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.243324 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.243358 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.243382 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:27Z","lastTransitionTime":"2026-02-21T21:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.346996 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.347058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.347075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.347097 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.347116 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:27Z","lastTransitionTime":"2026-02-21T21:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.449844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.449934 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.449954 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.449977 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.449995 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:27Z","lastTransitionTime":"2026-02-21T21:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.553698 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.553912 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.553942 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.553986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.554010 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:27Z","lastTransitionTime":"2026-02-21T21:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.657639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.657698 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.657717 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.657742 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.657762 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:27Z","lastTransitionTime":"2026-02-21T21:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.760788 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.760853 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.760905 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.760947 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.760980 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:27Z","lastTransitionTime":"2026-02-21T21:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.864786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.864890 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.864908 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.864936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.864957 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:27Z","lastTransitionTime":"2026-02-21T21:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.961420 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 09:04:14.165660676 +0000 UTC
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.968214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.968321 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.968346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.968375 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.968398 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:27Z","lastTransitionTime":"2026-02-21T21:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:27 crc kubenswrapper[4717]: I0221 21:47:27.975736 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg"
Feb 21 21:47:27 crc kubenswrapper[4717]: E0221 21:47:27.976007 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.071349 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.071413 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.071435 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.071457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.071474 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:28Z","lastTransitionTime":"2026-02-21T21:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.174773 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.174848 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.174893 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.174923 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.174942 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:28Z","lastTransitionTime":"2026-02-21T21:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.278800 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.278939 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.278963 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.278995 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.279020 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:28Z","lastTransitionTime":"2026-02-21T21:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.382264 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.382743 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.382767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.382794 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.382817 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:28Z","lastTransitionTime":"2026-02-21T21:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.485651 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.485728 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.485743 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.485769 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.485787 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:28Z","lastTransitionTime":"2026-02-21T21:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.589056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.589115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.589131 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.589153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.589169 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:28Z","lastTransitionTime":"2026-02-21T21:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.693425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.693490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.693503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.693521 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.693542 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:28Z","lastTransitionTime":"2026-02-21T21:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.797304 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.797401 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.797421 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.797452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.797471 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:28Z","lastTransitionTime":"2026-02-21T21:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.901158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.901215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.901233 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.901259 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.901277 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:28Z","lastTransitionTime":"2026-02-21T21:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.962593 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 11:29:13.87830656 +0000 UTC
Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.976136 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:28 crc kubenswrapper[4717]: E0221 21:47:28.976328 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.976625 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:28 crc kubenswrapper[4717]: E0221 21:47:28.976769 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:28 crc kubenswrapper[4717]: I0221 21:47:28.977169 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:28 crc kubenswrapper[4717]: E0221 21:47:28.977289 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.005332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.005408 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.005436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.005469 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.005498 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:29Z","lastTransitionTime":"2026-02-21T21:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.109114 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.109175 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.109192 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.109218 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.109235 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:29Z","lastTransitionTime":"2026-02-21T21:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.213194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.213227 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.213236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.213249 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.213260 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:29Z","lastTransitionTime":"2026-02-21T21:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.316939 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.317063 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.317079 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.317126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.317139 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:29Z","lastTransitionTime":"2026-02-21T21:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.420068 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.420134 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.420153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.420183 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.420206 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:29Z","lastTransitionTime":"2026-02-21T21:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.523216 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.523290 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.523313 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.523342 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.523361 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:29Z","lastTransitionTime":"2026-02-21T21:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.626585 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.626672 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.626695 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.626730 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.626753 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:29Z","lastTransitionTime":"2026-02-21T21:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.730358 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.730430 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.730451 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.730476 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.730493 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:29Z","lastTransitionTime":"2026-02-21T21:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.833050 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.833180 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.833192 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.833208 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.833220 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:29Z","lastTransitionTime":"2026-02-21T21:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.935746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.935798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.935808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.935825 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.935835 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:29Z","lastTransitionTime":"2026-02-21T21:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.963245 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 07:55:43.573312674 +0000 UTC Feb 21 21:47:29 crc kubenswrapper[4717]: I0221 21:47:29.975969 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:29 crc kubenswrapper[4717]: E0221 21:47:29.976158 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.038628 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.038673 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.038690 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.038712 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.038729 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:30Z","lastTransitionTime":"2026-02-21T21:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.141278 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.141344 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.141362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.141387 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.141441 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:30Z","lastTransitionTime":"2026-02-21T21:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.244769 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.244835 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.244853 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.244905 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.244926 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:30Z","lastTransitionTime":"2026-02-21T21:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.347838 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.347901 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.347910 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.347929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.347948 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:30Z","lastTransitionTime":"2026-02-21T21:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.450784 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.450856 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.450903 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.450929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.450947 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:30Z","lastTransitionTime":"2026-02-21T21:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.554135 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.554193 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.554210 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.554232 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.554305 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:30Z","lastTransitionTime":"2026-02-21T21:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.657587 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.657667 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.657719 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.657748 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.657765 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:30Z","lastTransitionTime":"2026-02-21T21:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.760988 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.761052 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.761070 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.761097 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.761116 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:30Z","lastTransitionTime":"2026-02-21T21:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.864236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.864288 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.864305 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.864331 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.864353 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:30Z","lastTransitionTime":"2026-02-21T21:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.963414 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 18:46:59.983397893 +0000 UTC Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.968961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.969039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.969053 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.969073 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.969090 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:30Z","lastTransitionTime":"2026-02-21T21:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.976261 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.976273 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:30 crc kubenswrapper[4717]: I0221 21:47:30.976359 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:30 crc kubenswrapper[4717]: E0221 21:47:30.976391 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:30 crc kubenswrapper[4717]: E0221 21:47:30.976580 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:30 crc kubenswrapper[4717]: E0221 21:47:30.976805 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.073674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.073749 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.073772 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.073806 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.073829 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:31Z","lastTransitionTime":"2026-02-21T21:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.177783 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.177837 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.177846 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.177886 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.177900 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:31Z","lastTransitionTime":"2026-02-21T21:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.281698 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.281779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.281802 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.281830 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.281852 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:31Z","lastTransitionTime":"2026-02-21T21:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.385958 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.386123 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.386145 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.386175 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.386230 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:31Z","lastTransitionTime":"2026-02-21T21:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.490516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.490592 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.490619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.490648 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.490675 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:31Z","lastTransitionTime":"2026-02-21T21:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.594299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.594372 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.594410 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.594439 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.594461 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:31Z","lastTransitionTime":"2026-02-21T21:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.697388 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.697449 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.697465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.697489 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.697507 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:31Z","lastTransitionTime":"2026-02-21T21:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.799819 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.800158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.800235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.800314 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.800399 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:31Z","lastTransitionTime":"2026-02-21T21:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.903398 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.903884 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.904052 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.904197 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.904313 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:31Z","lastTransitionTime":"2026-02-21T21:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.963905 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 05:57:45.601768344 +0000 UTC Feb 21 21:47:31 crc kubenswrapper[4717]: I0221 21:47:31.976272 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:31 crc kubenswrapper[4717]: E0221 21:47:31.976478 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.006929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.006975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.006991 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.007009 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.007027 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:32Z","lastTransitionTime":"2026-02-21T21:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.109687 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.109728 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.109737 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.109750 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.109758 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:32Z","lastTransitionTime":"2026-02-21T21:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.212574 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.212654 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.212667 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.212689 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.212704 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:32Z","lastTransitionTime":"2026-02-21T21:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.316002 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.316085 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.316108 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.316143 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.316166 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:32Z","lastTransitionTime":"2026-02-21T21:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.419581 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.419638 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.419650 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.419670 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.419685 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:32Z","lastTransitionTime":"2026-02-21T21:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.522684 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.522750 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.522770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.522798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.522816 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:32Z","lastTransitionTime":"2026-02-21T21:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.626582 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.626650 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.626664 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.626683 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.626697 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:32Z","lastTransitionTime":"2026-02-21T21:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.730391 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.730449 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.730461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.730481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.730494 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:32Z","lastTransitionTime":"2026-02-21T21:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.833784 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.833854 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.833902 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.833929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.833948 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:32Z","lastTransitionTime":"2026-02-21T21:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.936424 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.936459 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.936468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.936483 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.936496 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:32Z","lastTransitionTime":"2026-02-21T21:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.965112 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 03:33:49.058429029 +0000 UTC Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.975565 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.975644 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:32 crc kubenswrapper[4717]: I0221 21:47:32.975678 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:32 crc kubenswrapper[4717]: E0221 21:47:32.975788 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:32 crc kubenswrapper[4717]: E0221 21:47:32.976005 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:32 crc kubenswrapper[4717]: E0221 21:47:32.976178 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.040092 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.040162 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.040182 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.040210 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.040230 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:33Z","lastTransitionTime":"2026-02-21T21:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.143015 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.143057 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.143067 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.143082 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.143094 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:33Z","lastTransitionTime":"2026-02-21T21:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.246114 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.246160 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.246172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.246189 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.246202 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:33Z","lastTransitionTime":"2026-02-21T21:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.299488 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.304059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.304108 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.304132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.304155 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:33Z","lastTransitionTime":"2026-02-21T21:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:33 crc kubenswrapper[4717]: E0221 21:47:33.319383 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:33Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.323762 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.323826 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.323845 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.323897 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.323913 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:33Z","lastTransitionTime":"2026-02-21T21:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:33 crc kubenswrapper[4717]: E0221 21:47:33.337496 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:33Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.341715 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.341761 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.341774 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.341792 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.341805 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:33Z","lastTransitionTime":"2026-02-21T21:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:33 crc kubenswrapper[4717]: E0221 21:47:33.359148 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:33Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.363081 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.363124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.363135 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.363150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.363159 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:33Z","lastTransitionTime":"2026-02-21T21:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:33 crc kubenswrapper[4717]: E0221 21:47:33.379939 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:33Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.384410 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.384468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.384478 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.384503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.384519 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:33Z","lastTransitionTime":"2026-02-21T21:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:33 crc kubenswrapper[4717]: E0221 21:47:33.401950 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:33Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:33 crc kubenswrapper[4717]: E0221 21:47:33.402078 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.403838 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.403898 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.403909 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.403928 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.403939 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:33Z","lastTransitionTime":"2026-02-21T21:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.507046 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.507153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.507220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.507262 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.507296 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:33Z","lastTransitionTime":"2026-02-21T21:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.610012 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.610058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.610073 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.610093 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.610107 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:33Z","lastTransitionTime":"2026-02-21T21:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.713701 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.713753 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.713771 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.713794 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.713808 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:33Z","lastTransitionTime":"2026-02-21T21:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.816838 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.816924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.816943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.816967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.816988 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:33Z","lastTransitionTime":"2026-02-21T21:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.920726 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.920825 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.920895 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.920994 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.921013 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:33Z","lastTransitionTime":"2026-02-21T21:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.965725 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 06:03:30.492337169 +0000 UTC
Feb 21 21:47:33 crc kubenswrapper[4717]: I0221 21:47:33.976073 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg"
Feb 21 21:47:33 crc kubenswrapper[4717]: E0221 21:47:33.976199 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.023203 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.023253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.023266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.023282 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.023291 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:34Z","lastTransitionTime":"2026-02-21T21:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.126003 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.126065 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.126088 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.126114 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.126132 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:34Z","lastTransitionTime":"2026-02-21T21:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.229917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.229970 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.229981 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.230000 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.230010 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:34Z","lastTransitionTime":"2026-02-21T21:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.332395 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.332448 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.332460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.332480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.332496 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:34Z","lastTransitionTime":"2026-02-21T21:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.420216 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs\") pod \"network-metrics-daemon-gt2bg\" (UID: \"8203b79d-1367-43b6-8567-797ec1b0c09b\") " pod="openshift-multus/network-metrics-daemon-gt2bg"
Feb 21 21:47:34 crc kubenswrapper[4717]: E0221 21:47:34.420446 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 21 21:47:34 crc kubenswrapper[4717]: E0221 21:47:34.420594 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs podName:8203b79d-1367-43b6-8567-797ec1b0c09b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:06.420569763 +0000 UTC m=+101.202103395 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs") pod "network-metrics-daemon-gt2bg" (UID: "8203b79d-1367-43b6-8567-797ec1b0c09b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.435287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.435324 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.435336 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.435352 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.435365 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:34Z","lastTransitionTime":"2026-02-21T21:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.538923 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.539033 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.539051 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.539074 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.539125 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:34Z","lastTransitionTime":"2026-02-21T21:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.641564 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.641625 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.641639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.641659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.641672 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:34Z","lastTransitionTime":"2026-02-21T21:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.744409 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.744455 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.744467 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.744482 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.744491 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:34Z","lastTransitionTime":"2026-02-21T21:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.847765 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.847810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.847822 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.847843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.847854 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:34Z","lastTransitionTime":"2026-02-21T21:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.951712 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.951766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.951784 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.951810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.951827 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:34Z","lastTransitionTime":"2026-02-21T21:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.966151 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 13:51:51.637000313 +0000 UTC
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.975622 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.975789 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 21 21:47:34 crc kubenswrapper[4717]: E0221 21:47:34.975988 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 21 21:47:34 crc kubenswrapper[4717]: I0221 21:47:34.976301 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 21 21:47:34 crc kubenswrapper[4717]: E0221 21:47:34.976419 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 21 21:47:34 crc kubenswrapper[4717]: E0221 21:47:34.976735 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.054045 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.054107 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.054125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.054152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.054169 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:35Z","lastTransitionTime":"2026-02-21T21:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.170917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.170988 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.171008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.171035 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.171054 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:35Z","lastTransitionTime":"2026-02-21T21:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.273452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.273503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.273518 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.273536 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.273548 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:35Z","lastTransitionTime":"2026-02-21T21:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.376327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.376397 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.376417 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.376450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.376468 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:35Z","lastTransitionTime":"2026-02-21T21:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.476804 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzd94_d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da/kube-multus/0.log"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.476922 4717 generic.go:334] "Generic (PLEG): container finished" podID="d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da" containerID="938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994" exitCode=1
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.476966 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bzd94" event={"ID":"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da","Type":"ContainerDied","Data":"938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994"}
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.477497 4717 scope.go:117] "RemoveContainer" containerID="938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.479626 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.479767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.479784 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.479810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.479826 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:35Z","lastTransitionTime":"2026-02-21T21:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.491962 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.511026 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.529291 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.545179 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.565987 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.582573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.582615 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.582626 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.582644 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.582657 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:35Z","lastTransitionTime":"2026-02-21T21:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.587688 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.605342 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-
21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.626737 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"154bad15-f560-48f9-ae8e-92c12d3ae5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bfed6617bca25f3b3212d9fb2d1ae31778a863f9fa9b1fe3e9eb787dea44f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c2fa325d2da94225841ab78a9e160e52bace1f0ec2d121b93db85323b55785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04a04b3b7590932005e60eaf68fa97fdeb071b48df00bcdbb0c65deb8fe9da9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.649524 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.667918 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.685969 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.686034 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.686053 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.686080 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.686095 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:35Z","lastTransitionTime":"2026-02-21T21:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.688500 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.702028 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.717705 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:35Z\\\",\\\"message\\\":\\\"2026-02-21T21:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43\\\\n2026-02-21T21:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43 to /host/opt/cni/bin/\\\\n2026-02-21T21:46:50Z [verbose] multus-daemon started\\\\n2026-02-21T21:46:50Z [verbose] Readiness Indicator file check\\\\n2026-02-21T21:47:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.738253 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:11Z\\\",\\\"message\\\":\\\"es.lbConfig(nil)\\\\nI0221 21:47:11.426421 6368 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0221 21:47:11.426466 6368 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0221 21:47:11.426475 6368 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:47:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.753002 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.765930 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc 
kubenswrapper[4717]: I0221 21:47:35.782593 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.788461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.788512 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.788527 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.788546 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.788559 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:35Z","lastTransitionTime":"2026-02-21T21:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.892821 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.892916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.892937 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.892964 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.892984 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:35Z","lastTransitionTime":"2026-02-21T21:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.967218 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 00:22:33.805920771 +0000 UTC Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.975821 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:35 crc kubenswrapper[4717]: E0221 21:47:35.976044 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.995146 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.995243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.995263 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.995289 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.995308 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:35Z","lastTransitionTime":"2026-02-21T21:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:35 crc kubenswrapper[4717]: I0221 21:47:35.998959 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c924
68c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:35Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.016981 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.036110 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926
950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.077233 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a046
5d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.098027 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.098079 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.098094 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.098118 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.098133 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:36Z","lastTransitionTime":"2026-02-21T21:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.110396 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.139328 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.158071 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.174363 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"154bad15-f560-48f9-ae8e-92c12d3ae5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bfed6617bca25f3b3212d9fb2d1ae31778a863f9fa9b1fe3e9eb787dea44f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c2fa325d2da94225841ab78a9e160e52bace1f0ec2d121b93db85323b55785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04a04b3b7590932005e60eaf68fa97fdeb071b48df00bcdbb0c65deb8fe9da9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.190523 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.200922 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.200975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.200994 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.201020 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.201038 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:36Z","lastTransitionTime":"2026-02-21T21:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.206653 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.219629 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.232220 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.242027 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.255682 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:35Z\\\",\\\"message\\\":\\\"2026-02-21T21:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43\\\\n2026-02-21T21:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43 to /host/opt/cni/bin/\\\\n2026-02-21T21:46:50Z [verbose] multus-daemon started\\\\n2026-02-21T21:46:50Z [verbose] Readiness Indicator file check\\\\n2026-02-21T21:47:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.280299 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:11Z\\\",\\\"message\\\":\\\"es.lbConfig(nil)\\\\nI0221 21:47:11.426421 6368 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0221 21:47:11.426466 6368 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0221 21:47:11.426475 6368 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:47:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.296145 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.303596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.303629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.303647 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.303668 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.303685 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:36Z","lastTransitionTime":"2026-02-21T21:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.310603 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc 
kubenswrapper[4717]: I0221 21:47:36.406900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.406982 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.407003 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.407031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.407053 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:36Z","lastTransitionTime":"2026-02-21T21:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.483779 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzd94_d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da/kube-multus/0.log" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.483907 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bzd94" event={"ID":"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da","Type":"ContainerStarted","Data":"3ed2cfeb7efa6bfd33e2935b831034faad1ff107bd112b21d7e85dc510ddc227"} Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.499900 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc 
kubenswrapper[4717]: I0221 21:47:36.509920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.509985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.510006 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.510031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.510049 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:36Z","lastTransitionTime":"2026-02-21T21:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.514726 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.530177 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.545525 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.563247 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed2cfeb7efa6bfd33e2935b831034faad1ff107bd112b21d7e85dc510ddc227\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:35Z\\\",\\\"message\\\":\\\"2026-02-21T21:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43\\\\n2026-02-21T21:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43 to /host/opt/cni/bin/\\\\n2026-02-21T21:46:50Z [verbose] multus-daemon started\\\\n2026-02-21T21:46:50Z [verbose] Readiness Indicator file check\\\\n2026-02-21T21:47:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.593974 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:11Z\\\",\\\"message\\\":\\\"es.lbConfig(nil)\\\\nI0221 21:47:11.426421 6368 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0221 21:47:11.426466 6368 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0221 21:47:11.426475 6368 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:47:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.608892 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.613034 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.613081 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.613098 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.613124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.613145 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:36Z","lastTransitionTime":"2026-02-21T21:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.625800 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c924
68c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.637660 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.649530 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926
950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.677458 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a046
5d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.699840 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.715751 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.715790 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.715801 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.715819 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.715831 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:36Z","lastTransitionTime":"2026-02-21T21:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.720197 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.743017 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.764310 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"154bad15-f560-48f9-ae8e-92c12d3ae5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bfed6617bca25f3b3212d9fb2d1ae31778a863f9fa9b1fe3e9eb787dea44f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c2fa325d2da94225841ab78a9e160e52bace1f0ec2d121b93db85323b55785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04a04b3b7590932005e60eaf68fa97fdeb071b48df00bcdbb0c65deb8fe9da9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.787528 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.807630 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:36Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.818735 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.818783 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.818800 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.818821 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.818837 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:36Z","lastTransitionTime":"2026-02-21T21:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.921712 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.921775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.921791 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.921818 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.921848 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:36Z","lastTransitionTime":"2026-02-21T21:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.968341 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 21:10:02.072013089 +0000 UTC Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.975756 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.975792 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:36 crc kubenswrapper[4717]: I0221 21:47:36.975766 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:36 crc kubenswrapper[4717]: E0221 21:47:36.975925 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:36 crc kubenswrapper[4717]: E0221 21:47:36.975991 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:36 crc kubenswrapper[4717]: E0221 21:47:36.976069 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.024919 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.024969 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.024982 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.025002 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.025038 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:37Z","lastTransitionTime":"2026-02-21T21:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.128433 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.128503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.128522 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.128549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.128568 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:37Z","lastTransitionTime":"2026-02-21T21:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.232296 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.232362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.232376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.232401 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.232417 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:37Z","lastTransitionTime":"2026-02-21T21:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.335427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.335491 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.335501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.335519 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.335529 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:37Z","lastTransitionTime":"2026-02-21T21:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.438734 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.438802 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.438818 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.438838 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.438852 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:37Z","lastTransitionTime":"2026-02-21T21:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.541936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.541988 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.541999 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.542017 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.542032 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:37Z","lastTransitionTime":"2026-02-21T21:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.644682 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.644767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.644785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.644820 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.644839 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:37Z","lastTransitionTime":"2026-02-21T21:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.747785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.747850 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.747890 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.747906 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.747920 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:37Z","lastTransitionTime":"2026-02-21T21:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.851497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.851552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.851570 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.851594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.851613 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:37Z","lastTransitionTime":"2026-02-21T21:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.954897 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.954947 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.954961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.954980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.954991 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:37Z","lastTransitionTime":"2026-02-21T21:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.968856 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 20:27:54.776017349 +0000 UTC Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.975427 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:37 crc kubenswrapper[4717]: E0221 21:47:37.976380 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:37 crc kubenswrapper[4717]: I0221 21:47:37.976993 4717 scope.go:117] "RemoveContainer" containerID="651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.057458 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.057507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.057524 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.057545 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.057590 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:38Z","lastTransitionTime":"2026-02-21T21:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.159902 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.160333 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.160343 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.160359 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.160374 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:38Z","lastTransitionTime":"2026-02-21T21:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.263359 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.263405 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.263419 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.263436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.263450 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:38Z","lastTransitionTime":"2026-02-21T21:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.365650 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.365736 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.365754 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.365776 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.365792 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:38Z","lastTransitionTime":"2026-02-21T21:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.468382 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.468441 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.468454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.468473 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.468487 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:38Z","lastTransitionTime":"2026-02-21T21:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.492470 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovnkube-controller/2.log" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.496405 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerStarted","Data":"d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa"} Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.496840 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.512345 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.527080 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7
b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.544218 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.560825 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.571216 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.571249 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.571316 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.571335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.571345 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:38Z","lastTransitionTime":"2026-02-21T21:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.572957 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.586160 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"154bad15-f560-48f9-ae8e-92c12d3ae5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bfed6617bca25f3b3212d9fb2d1ae31778a863f9fa9b1fe3e9eb787dea44f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c2fa325d2da94225841ab78a9e160e52bace1f0ec2d121b93db85323b55785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04a04b3b7590932005e60eaf68fa97fdeb071b48df00bcdbb0c65deb8fe9da9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.606219 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.619490 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed2cfeb7efa6bfd33e2935b831034faad1ff107bd112b21d7e85dc510ddc227\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:35Z\\\",\\\"message\\\":\\\"2026-02-21T21:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43\\\\n2026-02-21T21:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43 to /host/opt/cni/bin/\\\\n2026-02-21T21:46:50Z [verbose] multus-daemon started\\\\n2026-02-21T21:46:50Z [verbose] 
Readiness Indicator file check\\\\n2026-02-21T21:47:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.639355 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:11Z\\\",\\\"message\\\":\\\"es.lbConfig(nil)\\\\nI0221 21:47:11.426421 6368 services_controller.go:451] Built service 
openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0221 21:47:11.426466 6368 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0221 21:47:11.426475 6368 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:47:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.651661 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.665186 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc 
kubenswrapper[4717]: I0221 21:47:38.674110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.674152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.674161 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.674178 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.674190 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:38Z","lastTransitionTime":"2026-02-21T21:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.683931 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.700743 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.715547 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.731381 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.745795 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.763650 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926
950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:38Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.776743 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.776796 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.776807 4717 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.776828 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.776841 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:38Z","lastTransitionTime":"2026-02-21T21:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.879551 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.879613 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.879626 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.879644 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.879659 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:38Z","lastTransitionTime":"2026-02-21T21:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.969490 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 08:20:40.833019733 +0000 UTC Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.975822 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.975918 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.976023 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:38 crc kubenswrapper[4717]: E0221 21:47:38.976104 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:38 crc kubenswrapper[4717]: E0221 21:47:38.976249 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:38 crc kubenswrapper[4717]: E0221 21:47:38.976293 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.982841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.982893 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.982904 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.982923 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:38 crc kubenswrapper[4717]: I0221 21:47:38.982937 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:38Z","lastTransitionTime":"2026-02-21T21:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.085902 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.085959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.085970 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.085993 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.086006 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:39Z","lastTransitionTime":"2026-02-21T21:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.189535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.189603 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.189621 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.189651 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.189672 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:39Z","lastTransitionTime":"2026-02-21T21:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.293441 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.293527 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.293552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.293584 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.293608 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:39Z","lastTransitionTime":"2026-02-21T21:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.397222 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.397287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.397299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.397319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.397330 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:39Z","lastTransitionTime":"2026-02-21T21:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.500025 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.500088 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.500108 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.500130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.500146 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:39Z","lastTransitionTime":"2026-02-21T21:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.506106 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovnkube-controller/3.log" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.507269 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovnkube-controller/2.log" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.510772 4717 generic.go:334] "Generic (PLEG): container finished" podID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerID="d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa" exitCode=1 Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.510830 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerDied","Data":"d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa"} Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.513133 4717 scope.go:117] "RemoveContainer" containerID="d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa" Feb 21 21:47:39 crc kubenswrapper[4717]: E0221 21:47:39.513400 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.513624 4717 scope.go:117] "RemoveContainer" containerID="651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.533198 4717 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.548337 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.564948 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926
950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.587287 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7
b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.603359 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.603412 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.603425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.603443 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.603456 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:39Z","lastTransitionTime":"2026-02-21T21:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.608155 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.628199 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.645395 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.664538 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"154bad15-f560-48f9-ae8e-92c12d3ae5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bfed6617bca25f3b3212d9fb2d1ae31778a863f9fa9b1fe3e9eb787dea44f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://57c2fa325d2da94225841ab78a9e160e52bace1f0ec2d121b93db85323b55785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04a04b3b7590932005e60eaf68fa97fdeb071b48df00bcdbb0c65deb8fe9da9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.682802 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.702964 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.706583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.706650 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.706660 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.706678 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.706692 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:39Z","lastTransitionTime":"2026-02-21T21:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.731595 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651bc5eaff312e9ed0a9914b563b3cce2af7e108a281d6bc2d881a18fe5e503d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:11Z\\\",\\\"message\\\":\\\"es.lbConfig(nil)\\\\nI0221 21:47:11.426421 6368 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0221 21:47:11.426466 6368 services_controller.go:452] Built service openshift-dns-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0221 21:47:11.426475 6368 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:47:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:38Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0221 21:47:38.954904 6734 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0221 21:47:38.954939 6734 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0221 21:47:38.954987 6734 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0221 21:47:38.954997 6734 handler.go:190] Sending *v1.Pod 
event handler 6 for removal\\\\nI0221 21:47:38.955021 6734 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0221 21:47:38.955044 6734 handler.go:208] Removed *v1.Node event handler 2\\\\nI0221 21:47:38.955085 6734 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0221 21:47:38.955099 6734 factory.go:656] Stopping watch factory\\\\nI0221 21:47:38.955110 6734 handler.go:208] Removed *v1.Node event handler 7\\\\nI0221 21:47:38.955117 6734 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0221 21:47:38.955131 6734 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0221 21:47:38.955378 6734 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0221 21:47:38.955511 6734 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0221 21:47:38.955580 6734 ovnkube.go:599] Stopped ovnkube\\\\nI0221 21:47:38.955635 6734 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0221 21:47:38.955737 6734 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mount
Path\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 
21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.742421 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.761452 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 
21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.782033 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.803834 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.809428 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.809473 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.809507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.809526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.809537 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:39Z","lastTransitionTime":"2026-02-21T21:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.822365 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.845537 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed2cfeb7efa6bfd33e2935b8
31034faad1ff107bd112b21d7e85dc510ddc227\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:35Z\\\",\\\"message\\\":\\\"2026-02-21T21:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43\\\\n2026-02-21T21:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43 to /host/opt/cni/bin/\\\\n2026-02-21T21:46:50Z [verbose] multus-daemon started\\\\n2026-02-21T21:46:50Z [verbose] Readiness Indicator file check\\\\n2026-02-21T21:47:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:39Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.912351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.912444 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.912468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.912496 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.912516 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:39Z","lastTransitionTime":"2026-02-21T21:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.971201 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 01:14:42.635047323 +0000 UTC Feb 21 21:47:39 crc kubenswrapper[4717]: I0221 21:47:39.975679 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:39 crc kubenswrapper[4717]: E0221 21:47:39.975909 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.015038 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.015100 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.015114 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.015133 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.015146 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:40Z","lastTransitionTime":"2026-02-21T21:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.118071 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.118131 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.118142 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.118161 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.118174 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:40Z","lastTransitionTime":"2026-02-21T21:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.222831 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.222942 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.222962 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.222989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.223011 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:40Z","lastTransitionTime":"2026-02-21T21:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.326545 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.326620 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.326638 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.326663 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.326682 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:40Z","lastTransitionTime":"2026-02-21T21:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.429948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.430023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.430048 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.430079 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.430103 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:40Z","lastTransitionTime":"2026-02-21T21:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.517640 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovnkube-controller/3.log" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.523496 4717 scope.go:117] "RemoveContainer" containerID="d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa" Feb 21 21:47:40 crc kubenswrapper[4717]: E0221 21:47:40.524065 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.535205 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.535292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.535318 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.535351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.535373 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:40Z","lastTransitionTime":"2026-02-21T21:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.542616 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.563805 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.581961 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.598365 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.613159 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.632266 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.638820 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.639010 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.639031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.639099 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.639120 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:40Z","lastTransitionTime":"2026-02-21T21:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.654175 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.668844 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.689567 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.710469 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"154bad15-f560-48f9-ae8e-92c12d3ae5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bfed6617bca25f3b3212d9fb2d1ae31778a863f9fa9b1fe3e9eb787dea44f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c2fa325d2da94225841ab78a9e160e52bace1f0ec2d121b93db85323b55785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04a04b3b7590932005e60eaf68fa97fdeb071b48df00bcdbb0c65deb8fe9da9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.729282 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa0
8f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.750071 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.750114 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.750132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:40 crc 
kubenswrapper[4717]: I0221 21:47:40.750155 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.750172 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:40Z","lastTransitionTime":"2026-02-21T21:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.751223 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed2cfeb7efa6bfd33e2935b831034faad1ff107bd112b21d7e85dc510ddc227\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:35Z\\\",\\\"message\\\":\\\"2026-02-21T21:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43\\\\n2026-02-21T21:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43 to /host/opt/cni/bin/\\\\n2026-02-21T21:46:50Z [verbose] multus-daemon started\\\\n2026-02-21T21:46:50Z [verbose] Readiness Indicator file check\\\\n2026-02-21T21:47:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.784717 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:38Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0221 21:47:38.954904 6734 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0221 21:47:38.954939 6734 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0221 21:47:38.954987 6734 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0221 
21:47:38.954997 6734 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0221 21:47:38.955021 6734 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0221 21:47:38.955044 6734 handler.go:208] Removed *v1.Node event handler 2\\\\nI0221 21:47:38.955085 6734 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0221 21:47:38.955099 6734 factory.go:656] Stopping watch factory\\\\nI0221 21:47:38.955110 6734 handler.go:208] Removed *v1.Node event handler 7\\\\nI0221 21:47:38.955117 6734 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0221 21:47:38.955131 6734 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0221 21:47:38.955378 6734 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0221 21:47:38.955511 6734 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0221 21:47:38.955580 6734 ovnkube.go:599] Stopped ovnkube\\\\nI0221 21:47:38.955635 6734 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0221 21:47:38.955737 6734 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:47:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.803392 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.823633 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc 
kubenswrapper[4717]: I0221 21:47:40.847028 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.852608 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.852639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.852649 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.852666 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.852677 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:40Z","lastTransitionTime":"2026-02-21T21:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.865426 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:40Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.957124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.957209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.957227 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.957254 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.957273 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:40Z","lastTransitionTime":"2026-02-21T21:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.971391 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 19:40:09.274366741 +0000 UTC Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.975815 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.975987 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:40 crc kubenswrapper[4717]: E0221 21:47:40.976077 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:40 crc kubenswrapper[4717]: E0221 21:47:40.976237 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:40 crc kubenswrapper[4717]: I0221 21:47:40.976370 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:40 crc kubenswrapper[4717]: E0221 21:47:40.976490 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.061401 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.061457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.061474 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.061501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.061520 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:41Z","lastTransitionTime":"2026-02-21T21:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.165908 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.165992 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.166013 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.166043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.166062 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:41Z","lastTransitionTime":"2026-02-21T21:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.270199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.270263 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.270283 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.270311 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.270333 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:41Z","lastTransitionTime":"2026-02-21T21:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.373675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.373715 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.373726 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.373742 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.373752 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:41Z","lastTransitionTime":"2026-02-21T21:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.478329 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.478369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.478381 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.478395 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.478405 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:41Z","lastTransitionTime":"2026-02-21T21:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.583089 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.583167 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.583188 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.583221 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.583246 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:41Z","lastTransitionTime":"2026-02-21T21:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.686546 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.686621 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.686641 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.687001 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.687054 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:41Z","lastTransitionTime":"2026-02-21T21:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.791338 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.791432 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.791456 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.791490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.791555 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:41Z","lastTransitionTime":"2026-02-21T21:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.895144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.895240 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.895260 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.895291 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.895313 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:41Z","lastTransitionTime":"2026-02-21T21:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.972446 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 05:00:49.326509903 +0000 UTC Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.976095 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:41 crc kubenswrapper[4717]: E0221 21:47:41.976354 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.998957 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.999092 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.999116 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.999146 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:41 crc kubenswrapper[4717]: I0221 21:47:41.999167 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:41Z","lastTransitionTime":"2026-02-21T21:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.103059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.103201 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.103221 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.103246 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.103269 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:42Z","lastTransitionTime":"2026-02-21T21:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.207094 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.207168 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.207194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.207230 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.207255 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:42Z","lastTransitionTime":"2026-02-21T21:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.311504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.311571 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.311591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.311619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.311639 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:42Z","lastTransitionTime":"2026-02-21T21:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.415746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.415798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.415815 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.415900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.415917 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:42Z","lastTransitionTime":"2026-02-21T21:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.518748 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.518810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.518827 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.518850 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.518959 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:42Z","lastTransitionTime":"2026-02-21T21:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.622080 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.622141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.622162 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.622187 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.622203 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:42Z","lastTransitionTime":"2026-02-21T21:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.726456 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.726538 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.726559 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.726589 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.726611 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:42Z","lastTransitionTime":"2026-02-21T21:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.830536 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.830614 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.830634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.830664 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.830687 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:42Z","lastTransitionTime":"2026-02-21T21:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.934520 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.934578 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.934597 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.934620 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.934638 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:42Z","lastTransitionTime":"2026-02-21T21:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.973562 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 04:27:40.316268532 +0000 UTC Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.975954 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.975996 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:42 crc kubenswrapper[4717]: I0221 21:47:42.976161 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:42 crc kubenswrapper[4717]: E0221 21:47:42.976314 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:42 crc kubenswrapper[4717]: E0221 21:47:42.976490 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:42 crc kubenswrapper[4717]: E0221 21:47:42.976610 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.037808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.037915 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.037939 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.037970 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.037989 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:43Z","lastTransitionTime":"2026-02-21T21:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.142810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.142909 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.142928 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.142986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.143004 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:43Z","lastTransitionTime":"2026-02-21T21:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.246242 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.246308 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.246325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.246349 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.246367 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:43Z","lastTransitionTime":"2026-02-21T21:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.349369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.349480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.349500 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.349562 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.349586 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:43Z","lastTransitionTime":"2026-02-21T21:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.452609 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.452675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.452693 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.452720 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.452744 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:43Z","lastTransitionTime":"2026-02-21T21:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.463333 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.463394 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.463410 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.463430 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.463450 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:43Z","lastTransitionTime":"2026-02-21T21:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:43 crc kubenswrapper[4717]: E0221 21:47:43.485982 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:43Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.493255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.493321 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.493348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.493377 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.493395 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:43Z","lastTransitionTime":"2026-02-21T21:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:43 crc kubenswrapper[4717]: E0221 21:47:43.515576 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:43Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.521841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.522120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.522283 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.522444 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.522573 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:43Z","lastTransitionTime":"2026-02-21T21:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:43 crc kubenswrapper[4717]: E0221 21:47:43.544154 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:43Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.549101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.549151 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.549169 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.549192 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.549208 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:43Z","lastTransitionTime":"2026-02-21T21:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:43 crc kubenswrapper[4717]: E0221 21:47:43.568815 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:43Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.574095 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.574323 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.574496 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.574632 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.574771 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:43Z","lastTransitionTime":"2026-02-21T21:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:43 crc kubenswrapper[4717]: E0221 21:47:43.606849 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:43Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:43 crc kubenswrapper[4717]: E0221 21:47:43.607235 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.610121 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.610232 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.610256 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.610314 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.610331 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:43Z","lastTransitionTime":"2026-02-21T21:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.713927 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.714047 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.714070 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.714132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.714150 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:43Z","lastTransitionTime":"2026-02-21T21:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.819548 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.819605 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.819623 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.819651 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.819667 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:43Z","lastTransitionTime":"2026-02-21T21:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.922559 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.922625 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.922643 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.922673 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.922696 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:43Z","lastTransitionTime":"2026-02-21T21:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.974418 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 17:27:31.404975089 +0000 UTC Feb 21 21:47:43 crc kubenswrapper[4717]: I0221 21:47:43.975943 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:43 crc kubenswrapper[4717]: E0221 21:47:43.976136 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.027358 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.027412 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.027427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.027450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.027466 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:44Z","lastTransitionTime":"2026-02-21T21:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.131559 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.131651 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.131675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.131706 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.131728 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:44Z","lastTransitionTime":"2026-02-21T21:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.235037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.235097 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.235116 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.235143 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.235162 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:44Z","lastTransitionTime":"2026-02-21T21:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.338880 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.338941 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.338959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.338997 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.339015 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:44Z","lastTransitionTime":"2026-02-21T21:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.442110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.442189 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.442209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.442237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.442257 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:44Z","lastTransitionTime":"2026-02-21T21:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.545161 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.545257 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.545284 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.545317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.545338 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:44Z","lastTransitionTime":"2026-02-21T21:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.648279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.648336 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.648359 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.648393 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.648432 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:44Z","lastTransitionTime":"2026-02-21T21:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.751455 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.751510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.751529 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.751553 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.751573 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:44Z","lastTransitionTime":"2026-02-21T21:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.855652 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.855709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.855723 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.855743 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.855755 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:44Z","lastTransitionTime":"2026-02-21T21:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.959062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.959142 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.959166 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.959196 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.959217 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:44Z","lastTransitionTime":"2026-02-21T21:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.975662 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 04:47:43.218532719 +0000 UTC Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.975806 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.975850 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:44 crc kubenswrapper[4717]: I0221 21:47:44.975955 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:44 crc kubenswrapper[4717]: E0221 21:47:44.975991 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:44 crc kubenswrapper[4717]: E0221 21:47:44.976077 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:44 crc kubenswrapper[4717]: E0221 21:47:44.976206 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.062998 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.063490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.063700 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.063942 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.064151 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:45Z","lastTransitionTime":"2026-02-21T21:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.168522 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.168586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.168608 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.168640 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.168671 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:45Z","lastTransitionTime":"2026-02-21T21:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.310680 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.310754 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.310772 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.310800 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.310817 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:45Z","lastTransitionTime":"2026-02-21T21:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.414135 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.414212 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.414226 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.414251 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.414269 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:45Z","lastTransitionTime":"2026-02-21T21:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.517503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.517594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.517614 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.517645 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.517669 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:45Z","lastTransitionTime":"2026-02-21T21:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.621304 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.621390 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.621404 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.621427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.621442 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:45Z","lastTransitionTime":"2026-02-21T21:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.724775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.724831 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.724847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.724886 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.724900 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:45Z","lastTransitionTime":"2026-02-21T21:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.829146 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.829239 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.829295 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.829321 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.829337 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:45Z","lastTransitionTime":"2026-02-21T21:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.932102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.932145 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.932157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.932175 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.932185 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:45Z","lastTransitionTime":"2026-02-21T21:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.976197 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 05:59:28.021499114 +0000 UTC Feb 21 21:47:45 crc kubenswrapper[4717]: I0221 21:47:45.977217 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:45 crc kubenswrapper[4717]: E0221 21:47:45.980121 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.006277 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\
\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.028695 4717 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-
recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.035165 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.035215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.035237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.035268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.035290 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:46Z","lastTransitionTime":"2026-02-21T21:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.048613 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.064779 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1
eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.086822 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.111609 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.129547 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825
771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 
only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.137741 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.137843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 
21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.137876 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.137896 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.137908 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:46Z","lastTransitionTime":"2026-02-21T21:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.147316 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.161267 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.177438 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"154bad15-f560-48f9-ae8e-92c12d3ae5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bfed6617bca25f3b3212d9fb2d1ae31778a863f9fa9b1fe3e9eb787dea44f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c2fa325d2da94225841ab78a9e160e52bace1f0ec2d121b93db85323b55785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04a04b3b7590932005e60eaf68fa97fdeb071b48df00bcdbb0c65deb8fe9da9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.192398 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa0
8f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.213781 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed2cfeb7efa6bfd33e2935b831034faad1ff107bd112b21d7e85dc510ddc227\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:35Z\\\",\\\"message\\\":\\\"2026-02-21T21:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43\\\\n2026-02-21T21:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43 to /host/opt/cni/bin/\\\\n2026-02-21T21:46:50Z [verbose] multus-daemon started\\\\n2026-02-21T21:46:50Z [verbose] 
Readiness Indicator file check\\\\n2026-02-21T21:47:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.243157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.243198 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.243210 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.243225 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.243236 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:46Z","lastTransitionTime":"2026-02-21T21:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.245446 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:38Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0221 21:47:38.954904 6734 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0221 21:47:38.954939 6734 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0221 21:47:38.954987 6734 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0221 
21:47:38.954997 6734 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0221 21:47:38.955021 6734 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0221 21:47:38.955044 6734 handler.go:208] Removed *v1.Node event handler 2\\\\nI0221 21:47:38.955085 6734 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0221 21:47:38.955099 6734 factory.go:656] Stopping watch factory\\\\nI0221 21:47:38.955110 6734 handler.go:208] Removed *v1.Node event handler 7\\\\nI0221 21:47:38.955117 6734 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0221 21:47:38.955131 6734 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0221 21:47:38.955378 6734 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0221 21:47:38.955511 6734 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0221 21:47:38.955580 6734 ovnkube.go:599] Stopped ovnkube\\\\nI0221 21:47:38.955635 6734 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0221 21:47:38.955737 6734 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:47:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.263382 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.280926 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc 
kubenswrapper[4717]: I0221 21:47:46.303649 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.325780 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:46Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.351330 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.351387 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.351400 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.351421 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.351435 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:46Z","lastTransitionTime":"2026-02-21T21:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.455332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.455377 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.455387 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.455405 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.455417 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:46Z","lastTransitionTime":"2026-02-21T21:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.561301 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.561387 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.561407 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.561436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.561453 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:46Z","lastTransitionTime":"2026-02-21T21:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.664702 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.664768 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.664786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.664816 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.664842 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:46Z","lastTransitionTime":"2026-02-21T21:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.767945 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.768021 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.768043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.768074 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.768096 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:46Z","lastTransitionTime":"2026-02-21T21:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.871827 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.871916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.871934 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.871958 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.871974 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:46Z","lastTransitionTime":"2026-02-21T21:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.975261 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.975308 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.975325 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.975348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.975387 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.975417 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.975440 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:46Z","lastTransitionTime":"2026-02-21T21:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:46 crc kubenswrapper[4717]: E0221 21:47:46.975694 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:46 crc kubenswrapper[4717]: E0221 21:47:46.975850 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.976215 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 21 21:47:46 crc kubenswrapper[4717]: E0221 21:47:46.976370 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 21 21:47:46 crc kubenswrapper[4717]: I0221 21:47:46.976497 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 21:48:58.533367479 +0000 UTC
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.078856 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.078978 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.079008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.079039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.079062 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:47Z","lastTransitionTime":"2026-02-21T21:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.182121 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.182186 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.182211 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.182241 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.182267 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:47Z","lastTransitionTime":"2026-02-21T21:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.285437 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.285503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.285519 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.285545 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.285566 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:47Z","lastTransitionTime":"2026-02-21T21:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.388796 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.388852 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.388894 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.388915 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.388928 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:47Z","lastTransitionTime":"2026-02-21T21:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.491994 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.492063 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.492078 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.492104 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.492119 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:47Z","lastTransitionTime":"2026-02-21T21:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.594985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.595065 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.595090 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.595122 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.595148 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:47Z","lastTransitionTime":"2026-02-21T21:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.698579 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.699081 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.699103 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.699130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.699149 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:47Z","lastTransitionTime":"2026-02-21T21:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.803024 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.803127 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.803158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.803193 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.803214 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:47Z","lastTransitionTime":"2026-02-21T21:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.906981 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.907054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.907075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.907101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.907122 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:47Z","lastTransitionTime":"2026-02-21T21:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.979176 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 13:50:42.274656939 +0000 UTC
Feb 21 21:47:47 crc kubenswrapper[4717]: I0221 21:47:47.979467 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg"
Feb 21 21:47:47 crc kubenswrapper[4717]: E0221 21:47:47.979752 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.009426 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.009482 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.009506 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.009537 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.009557 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:48Z","lastTransitionTime":"2026-02-21T21:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.112653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.112716 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.112739 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.112769 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.112790 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:48Z","lastTransitionTime":"2026-02-21T21:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.216195 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.216235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.216252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.216275 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.216292 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:48Z","lastTransitionTime":"2026-02-21T21:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.320152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.320235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.320261 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.320293 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.320317 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:48Z","lastTransitionTime":"2026-02-21T21:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.423931 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.424009 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.424028 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.424054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.424070 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:48Z","lastTransitionTime":"2026-02-21T21:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.527553 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.527629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.527655 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.527685 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.527714 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:48Z","lastTransitionTime":"2026-02-21T21:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.631067 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.631180 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.631202 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.631233 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.631257 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:48Z","lastTransitionTime":"2026-02-21T21:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.735318 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.735405 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.735429 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.735458 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.735478 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:48Z","lastTransitionTime":"2026-02-21T21:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.839337 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.839408 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.839429 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.839456 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.839477 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:48Z","lastTransitionTime":"2026-02-21T21:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.943202 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.943314 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.943333 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.943357 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.943375 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:48Z","lastTransitionTime":"2026-02-21T21:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.976097 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.976150 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.976221 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 21 21:47:48 crc kubenswrapper[4717]: E0221 21:47:48.976297 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 21 21:47:48 crc kubenswrapper[4717]: E0221 21:47:48.976487 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 21 21:47:48 crc kubenswrapper[4717]: E0221 21:47:48.976624 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 21 21:47:48 crc kubenswrapper[4717]: I0221 21:47:48.980497 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 20:34:15.029997705 +0000 UTC
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.050796 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.050921 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.050943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.050973 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.050995 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:49Z","lastTransitionTime":"2026-02-21T21:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.155851 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.155959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.155977 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.156002 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.156020 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:49Z","lastTransitionTime":"2026-02-21T21:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.259664 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.259721 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.259738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.259759 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.259781 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:49Z","lastTransitionTime":"2026-02-21T21:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.362677 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.362747 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.362770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.362800 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.362823 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:49Z","lastTransitionTime":"2026-02-21T21:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.466166 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.466691 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.466709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.466737 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.466755 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:49Z","lastTransitionTime":"2026-02-21T21:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.569794 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.569920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.569946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.569977 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.570000 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:49Z","lastTransitionTime":"2026-02-21T21:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.602153 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 21 21:47:49 crc kubenswrapper[4717]: E0221 21:47:49.602389 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 21 21:47:49 crc kubenswrapper[4717]: E0221 21:47:49.602530 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.602500352 +0000 UTC m=+148.384034014 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.673759 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.673831 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.673853 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.673925 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.673948 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:49Z","lastTransitionTime":"2026-02-21T21:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.703232 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.703354 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 21 21:47:49 crc kubenswrapper[4717]: E0221 21:47:49.703414 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 21 21:47:49 crc kubenswrapper[4717]: E0221 21:47:49.703521 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.703488672 +0000 UTC m=+148.485022334 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 21 21:47:49 crc kubenswrapper[4717]: E0221 21:47:49.703564 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 21 21:47:49 crc kubenswrapper[4717]: E0221 21:47:49.703595 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 21 21:47:49 crc kubenswrapper[4717]: E0221 21:47:49.703619 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 21 21:47:49 crc kubenswrapper[4717]: E0221 21:47:49.703743 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.703715547 +0000 UTC m=+148.485249209 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.776033 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.776114 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.776139 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.776171 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.776192 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:49Z","lastTransitionTime":"2026-02-21T21:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.804496 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:47:49 crc kubenswrapper[4717]: E0221 21:47:49.804667 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.804629326 +0000 UTC m=+148.586163008 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.804790 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:49 crc kubenswrapper[4717]: E0221 21:47:49.805070 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 21 21:47:49 crc kubenswrapper[4717]: E0221 21:47:49.805112 4717 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 21 21:47:49 crc kubenswrapper[4717]: E0221 21:47:49.805147 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:47:49 crc kubenswrapper[4717]: E0221 21:47:49.805223 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.80520055 +0000 UTC m=+148.586734202 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.880368 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.880445 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.880469 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.880501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.880524 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:49Z","lastTransitionTime":"2026-02-21T21:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.975390 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:49 crc kubenswrapper[4717]: E0221 21:47:49.975626 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.981227 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 13:20:31.469121019 +0000 UTC Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.985748 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.985827 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.985854 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.985921 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:49 crc kubenswrapper[4717]: I0221 21:47:49.985944 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:49Z","lastTransitionTime":"2026-02-21T21:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.005656 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.089495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.089555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.089573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.089605 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.089632 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:50Z","lastTransitionTime":"2026-02-21T21:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.192667 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.193291 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.193451 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.193604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.193734 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:50Z","lastTransitionTime":"2026-02-21T21:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.297608 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.297680 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.297700 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.297724 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.297743 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:50Z","lastTransitionTime":"2026-02-21T21:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.401287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.401354 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.401379 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.401408 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.401428 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:50Z","lastTransitionTime":"2026-02-21T21:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.504314 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.504391 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.504414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.504444 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.504531 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:50Z","lastTransitionTime":"2026-02-21T21:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.607780 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.607913 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.607933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.607963 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.607981 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:50Z","lastTransitionTime":"2026-02-21T21:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.710980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.711043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.711061 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.711084 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.711103 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:50Z","lastTransitionTime":"2026-02-21T21:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.814041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.814122 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.814142 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.814177 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.814194 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:50Z","lastTransitionTime":"2026-02-21T21:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.917692 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.917777 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.917798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.917823 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.917840 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:50Z","lastTransitionTime":"2026-02-21T21:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.975927 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.976007 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:50 crc kubenswrapper[4717]: E0221 21:47:50.976126 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.976032 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:50 crc kubenswrapper[4717]: E0221 21:47:50.976234 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:50 crc kubenswrapper[4717]: E0221 21:47:50.976575 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:50 crc kubenswrapper[4717]: I0221 21:47:50.981952 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 14:44:09.826791315 +0000 UTC Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.022349 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.022408 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.022426 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.022451 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.022470 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:51Z","lastTransitionTime":"2026-02-21T21:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.125772 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.126281 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.126461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.126608 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.126740 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:51Z","lastTransitionTime":"2026-02-21T21:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.229927 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.230004 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.230030 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.230059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.230082 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:51Z","lastTransitionTime":"2026-02-21T21:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.332808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.332913 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.332936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.332959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.332975 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:51Z","lastTransitionTime":"2026-02-21T21:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.436754 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.436814 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.436834 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.436897 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.436920 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:51Z","lastTransitionTime":"2026-02-21T21:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.539645 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.539719 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.539755 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.539790 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.539815 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:51Z","lastTransitionTime":"2026-02-21T21:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.642591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.642660 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.642679 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.642704 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.642728 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:51Z","lastTransitionTime":"2026-02-21T21:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.746272 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.746356 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.746378 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.746411 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.746432 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:51Z","lastTransitionTime":"2026-02-21T21:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.850366 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.850434 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.850452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.850481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.850500 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:51Z","lastTransitionTime":"2026-02-21T21:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.955670 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.955718 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.955766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.955790 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.955806 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:51Z","lastTransitionTime":"2026-02-21T21:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.976381 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:51 crc kubenswrapper[4717]: E0221 21:47:51.976730 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:51 crc kubenswrapper[4717]: I0221 21:47:51.982066 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 22:02:08.356050979 +0000 UTC Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.059331 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.059387 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.059404 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.059432 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.059450 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:52Z","lastTransitionTime":"2026-02-21T21:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.163001 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.163067 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.163088 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.163111 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.163128 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:52Z","lastTransitionTime":"2026-02-21T21:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.267804 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.267927 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.267956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.268022 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.268041 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:52Z","lastTransitionTime":"2026-02-21T21:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.371673 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.371738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.371756 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.371778 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.371797 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:52Z","lastTransitionTime":"2026-02-21T21:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.474768 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.474809 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.474818 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.474832 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.474840 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:52Z","lastTransitionTime":"2026-02-21T21:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.578129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.578199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.578219 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.578246 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.578268 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:52Z","lastTransitionTime":"2026-02-21T21:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.681925 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.681991 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.682010 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.682034 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.682052 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:52Z","lastTransitionTime":"2026-02-21T21:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.785141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.785202 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.785223 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.785279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.785299 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:52Z","lastTransitionTime":"2026-02-21T21:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.899928 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.899987 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.900005 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.900027 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.900043 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:52Z","lastTransitionTime":"2026-02-21T21:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.976322 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:52 crc kubenswrapper[4717]: E0221 21:47:52.976510 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.976626 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.976649 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:52 crc kubenswrapper[4717]: E0221 21:47:52.977154 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:52 crc kubenswrapper[4717]: E0221 21:47:52.977334 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.982710 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 10:40:50.33785356 +0000 UTC Feb 21 21:47:52 crc kubenswrapper[4717]: I0221 21:47:52.995056 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.003410 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.003474 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.003499 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.003529 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.003552 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:53Z","lastTransitionTime":"2026-02-21T21:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.106513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.106581 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.106604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.106632 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.106654 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:53Z","lastTransitionTime":"2026-02-21T21:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.210081 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.210158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.210181 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.210209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.210229 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:53Z","lastTransitionTime":"2026-02-21T21:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.314138 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.314191 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.314209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.314231 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.314247 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:53Z","lastTransitionTime":"2026-02-21T21:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.416929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.416989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.417012 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.417040 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.417063 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:53Z","lastTransitionTime":"2026-02-21T21:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.522633 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.522683 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.522699 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.522721 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.522738 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:53Z","lastTransitionTime":"2026-02-21T21:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.626025 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.626143 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.626672 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.626773 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.627126 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:53Z","lastTransitionTime":"2026-02-21T21:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.666079 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.666154 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.666174 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.666635 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.666705 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:53Z","lastTransitionTime":"2026-02-21T21:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:53 crc kubenswrapper[4717]: E0221 21:47:53.688562 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.694061 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.694133 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.694162 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.694200 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.694228 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:53Z","lastTransitionTime":"2026-02-21T21:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:53 crc kubenswrapper[4717]: E0221 21:47:53.714629 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.721416 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.721489 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.721515 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.721546 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.721573 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:53Z","lastTransitionTime":"2026-02-21T21:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:53 crc kubenswrapper[4717]: E0221 21:47:53.746236 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.752186 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.752275 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.752306 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.752358 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.752381 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:53Z","lastTransitionTime":"2026-02-21T21:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:53 crc kubenswrapper[4717]: E0221 21:47:53.773912 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.779012 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.779106 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.779128 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.779157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.779177 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:53Z","lastTransitionTime":"2026-02-21T21:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:53 crc kubenswrapper[4717]: E0221 21:47:53.797620 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:53Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:53 crc kubenswrapper[4717]: E0221 21:47:53.797972 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.800652 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.800748 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.800768 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.800795 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.800813 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:53Z","lastTransitionTime":"2026-02-21T21:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.903634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.903722 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.903741 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.903762 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.903779 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:53Z","lastTransitionTime":"2026-02-21T21:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.976091 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:53 crc kubenswrapper[4717]: E0221 21:47:53.976312 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.977452 4717 scope.go:117] "RemoveContainer" containerID="d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa" Feb 21 21:47:53 crc kubenswrapper[4717]: E0221 21:47:53.977784 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" Feb 21 21:47:53 crc kubenswrapper[4717]: I0221 21:47:53.982982 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 10:51:47.870461741 +0000 UTC Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.006593 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.006678 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.006704 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.006735 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.006762 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:54Z","lastTransitionTime":"2026-02-21T21:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.111039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.111122 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.111141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.111170 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.111192 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:54Z","lastTransitionTime":"2026-02-21T21:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.215102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.215181 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.215201 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.215228 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.215245 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:54Z","lastTransitionTime":"2026-02-21T21:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.319019 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.319094 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.319115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.319144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.319167 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:54Z","lastTransitionTime":"2026-02-21T21:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.421800 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.421900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.421920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.421946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.421964 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:54Z","lastTransitionTime":"2026-02-21T21:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.525834 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.525938 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.525962 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.525997 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.526018 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:54Z","lastTransitionTime":"2026-02-21T21:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.631390 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.631460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.631479 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.631509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.631530 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:54Z","lastTransitionTime":"2026-02-21T21:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.736100 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.736181 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.736206 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.736241 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.736269 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:54Z","lastTransitionTime":"2026-02-21T21:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.840327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.840412 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.840432 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.840464 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.840486 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:54Z","lastTransitionTime":"2026-02-21T21:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.944283 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.944362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.944383 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.944410 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.944431 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:54Z","lastTransitionTime":"2026-02-21T21:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.976324 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.976384 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.976331 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:54 crc kubenswrapper[4717]: E0221 21:47:54.976553 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:54 crc kubenswrapper[4717]: E0221 21:47:54.976753 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:54 crc kubenswrapper[4717]: E0221 21:47:54.976954 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:54 crc kubenswrapper[4717]: I0221 21:47:54.983248 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 06:15:49.561655351 +0000 UTC Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.047940 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.048005 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.048027 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.048062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.048089 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:55Z","lastTransitionTime":"2026-02-21T21:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.151165 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.151237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.151259 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.151289 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.151311 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:55Z","lastTransitionTime":"2026-02-21T21:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.254758 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.254847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.254921 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.254955 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.254981 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:55Z","lastTransitionTime":"2026-02-21T21:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.357716 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.357785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.357804 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.357829 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.357849 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:55Z","lastTransitionTime":"2026-02-21T21:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.462046 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.462130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.462153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.462184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.462207 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:55Z","lastTransitionTime":"2026-02-21T21:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.566023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.566104 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.566131 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.566165 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.566192 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:55Z","lastTransitionTime":"2026-02-21T21:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.669537 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.669618 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.669637 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.669666 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.669687 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:55Z","lastTransitionTime":"2026-02-21T21:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.773476 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.773548 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.773568 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.773605 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.773626 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:55Z","lastTransitionTime":"2026-02-21T21:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.877725 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.877817 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.877847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.877932 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.877962 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:55Z","lastTransitionTime":"2026-02-21T21:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.976412 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:55 crc kubenswrapper[4717]: E0221 21:47:55.976668 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.982177 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.982273 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.982297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.982328 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.982347 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:55Z","lastTransitionTime":"2026-02-21T21:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.983388 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 12:51:15.872201734 +0000 UTC Feb 21 21:47:55 crc kubenswrapper[4717]: I0221 21:47:55.998995 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-21T21:47:55Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.021612 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-
o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-oper
ator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.042318 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.068024 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.091540 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:47:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.117408 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.140969 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825
771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 
only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.157791 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"190c3fdb-77ff-4cde-9a39-6c866a164001\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955da883505373fb420d68cfa68a9cbb458ab8b9d419a5948b02ff33fe41b14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185793e7e1f802a2b3276fcf193733a50ae99245b3c9dcfab11256e857b70eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://185793e7e1f802a2b3276fcf193733a50ae99245b3c9dcfab11256e857b70eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.909525 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.909585 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.909597 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.909617 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.909628 4717 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:56Z","lastTransitionTime":"2026-02-21T21:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.917016 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.937723 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.964590 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5da8c30-ff59-42a6-8305-27e0f12d730d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9703a4c04285be2b1a9b9a1b681bd354bc2266389c4a64aa7e5bbcfb2d8906f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86446202e0d26ca2c2c3973d810ef949382d1981b62e3319b17fc4ff201378e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd653efc0f8ae7e2a2244598a51c694ea9f209e3419c9ad8ba135fbab546647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc6cf6818d83e685016e62a94546c4e7b5e405af261ee0269db9dc1be501e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://625bcd2589c9c533c58a35ce056454884d93dccfc75cbc494e6063b8d8fefb39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f730b67d5e86f2248b208a0531cc5838750b086d4aa81eafb38791af7b683d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f730b67d5e86f2248b208a0531cc5838750b086d4aa81eafb38791af7b683d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7040d4b6e66eec274468a6da614a01d836f977669de37f0735d39b7fb6ad5b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7040d4b6e66eec274468a6da614a01d836f977669de37f0735d39b7fb6ad5b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e459081a802e9d60a5ee2f1b48cbd144b283561477c74f718bf6dfd680d5b56c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e459081a802e9d60a5ee2f1b48cbd144b283561477c74f718bf6dfd680d5b56c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.975587 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.975644 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.975714 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:56 crc kubenswrapper[4717]: E0221 21:47:56.975803 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:56 crc kubenswrapper[4717]: E0221 21:47:56.976021 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:56 crc kubenswrapper[4717]: E0221 21:47:56.976289 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.982301 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"154bad15-f560-48f9-ae8e-92c12d3ae5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bfed6617bca25f3b3212d9fb2d1ae31778a863f9fa9b1fe3e9eb787dea44f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c2fa325d2da94225841ab78a9e160e52bace1f0ec2d121b93db85323b55785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04a04b3b7590932005e60eaf68fa97fdeb071b48df00bcdbb0c65deb8fe9da9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.984503 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 23:43:55.821649795 +0000 UTC Feb 21 21:47:56 crc kubenswrapper[4717]: I0221 21:47:56.999711 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:56Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.013884 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.013952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.013971 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.013995 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.014012 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:57Z","lastTransitionTime":"2026-02-21T21:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.017067 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed2cfeb7efa6bfd33e2935b831034faad1ff107bd112b21d7e85dc510ddc227\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:35Z\\\",\\\"message\\\":\\\"2026-02-21T21:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43\\\\n2026-02-21T21:46:50+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43 to /host/opt/cni/bin/\\\\n2026-02-21T21:46:50Z [verbose] multus-daemon started\\\\n2026-02-21T21:46:50Z [verbose] Readiness Indicator file check\\\\n2026-02-21T21:47:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:57Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.047372 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:38Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0221 21:47:38.954904 
6734 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0221 21:47:38.954939 6734 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0221 21:47:38.954987 6734 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0221 21:47:38.954997 6734 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0221 21:47:38.955021 6734 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0221 21:47:38.955044 6734 handler.go:208] Removed *v1.Node event handler 2\\\\nI0221 21:47:38.955085 6734 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0221 21:47:38.955099 6734 factory.go:656] Stopping watch factory\\\\nI0221 21:47:38.955110 6734 handler.go:208] Removed *v1.Node event handler 7\\\\nI0221 21:47:38.955117 6734 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0221 21:47:38.955131 6734 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0221 21:47:38.955378 6734 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0221 21:47:38.955511 6734 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0221 21:47:38.955580 6734 ovnkube.go:599] Stopped ovnkube\\\\nI0221 21:47:38.955635 6734 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0221 21:47:38.955737 6734 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:47:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:57Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.062821 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:57Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.078265 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:57Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:57 crc 
kubenswrapper[4717]: I0221 21:47:57.099486 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:57Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.117523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.117603 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.117622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.117663 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.117685 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:57Z","lastTransitionTime":"2026-02-21T21:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.120714 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:47:57Z is after 2025-08-24T17:21:41Z" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.221775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.221922 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.221952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.221986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.222010 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:57Z","lastTransitionTime":"2026-02-21T21:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.325547 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.325611 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.325633 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.325661 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.325682 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:57Z","lastTransitionTime":"2026-02-21T21:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.428578 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.428656 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.428681 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.428718 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.428742 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:57Z","lastTransitionTime":"2026-02-21T21:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.532312 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.532380 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.532397 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.532424 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.532441 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:57Z","lastTransitionTime":"2026-02-21T21:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.635841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.635954 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.635979 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.636009 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.636034 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:57Z","lastTransitionTime":"2026-02-21T21:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.740056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.740119 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.740132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.740155 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.740171 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:57Z","lastTransitionTime":"2026-02-21T21:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.843945 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.845240 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.845265 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.845309 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.845330 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:57Z","lastTransitionTime":"2026-02-21T21:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.949363 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.949460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.949480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.949507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.949525 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:57Z","lastTransitionTime":"2026-02-21T21:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.975411 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:57 crc kubenswrapper[4717]: E0221 21:47:57.975600 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:57 crc kubenswrapper[4717]: I0221 21:47:57.985212 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 13:24:58.861124062 +0000 UTC Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.053220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.053297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.053317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.053347 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.053372 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:58Z","lastTransitionTime":"2026-02-21T21:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.156914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.156994 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.157013 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.157042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.157060 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:58Z","lastTransitionTime":"2026-02-21T21:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.261577 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.261653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.261669 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.261702 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.261720 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:58Z","lastTransitionTime":"2026-02-21T21:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.364964 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.365055 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.365078 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.365108 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.365132 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:58Z","lastTransitionTime":"2026-02-21T21:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.474887 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.474946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.474967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.474994 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.475013 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:58Z","lastTransitionTime":"2026-02-21T21:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.579120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.579259 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.579352 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.579440 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.579470 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:58Z","lastTransitionTime":"2026-02-21T21:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.683507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.683561 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.683573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.683589 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.683600 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:58Z","lastTransitionTime":"2026-02-21T21:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.787009 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.787068 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.787080 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.787099 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.787112 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:58Z","lastTransitionTime":"2026-02-21T21:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.889835 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.889932 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.889950 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.889974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.889992 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:58Z","lastTransitionTime":"2026-02-21T21:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.975406 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.975455 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.975492 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:47:58 crc kubenswrapper[4717]: E0221 21:47:58.975628 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:47:58 crc kubenswrapper[4717]: E0221 21:47:58.975768 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:47:58 crc kubenswrapper[4717]: E0221 21:47:58.975972 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.985432 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 09:29:40.343508879 +0000 UTC Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.993152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.993212 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.993232 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.993259 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:58 crc kubenswrapper[4717]: I0221 21:47:58.993279 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:58Z","lastTransitionTime":"2026-02-21T21:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.096443 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.096510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.096527 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.096587 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.096609 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:59Z","lastTransitionTime":"2026-02-21T21:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.200285 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.200352 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.200371 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.200395 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.200412 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:59Z","lastTransitionTime":"2026-02-21T21:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.304043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.304125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.304149 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.304179 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.304199 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:59Z","lastTransitionTime":"2026-02-21T21:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.407934 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.407999 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.408019 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.408043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.408061 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:59Z","lastTransitionTime":"2026-02-21T21:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.511708 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.511772 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.511791 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.511817 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.511835 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:59Z","lastTransitionTime":"2026-02-21T21:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.615625 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.615692 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.615705 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.615728 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.615741 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:59Z","lastTransitionTime":"2026-02-21T21:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.719214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.719283 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.719302 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.719327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.719344 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:59Z","lastTransitionTime":"2026-02-21T21:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.822296 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.822359 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.822371 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.822391 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.822405 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:59Z","lastTransitionTime":"2026-02-21T21:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.926065 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.926143 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.926170 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.926199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.926224 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:47:59Z","lastTransitionTime":"2026-02-21T21:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.975531 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:47:59 crc kubenswrapper[4717]: E0221 21:47:59.975758 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:47:59 crc kubenswrapper[4717]: I0221 21:47:59.986577 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 08:16:24.83982213 +0000 UTC Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.030826 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.030933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.030952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.030978 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.030998 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:00Z","lastTransitionTime":"2026-02-21T21:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.134573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.134646 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.134666 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.134693 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.134712 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:00Z","lastTransitionTime":"2026-02-21T21:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.237277 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.237353 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.237377 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.237405 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.237428 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:00Z","lastTransitionTime":"2026-02-21T21:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.341399 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.341474 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.341494 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.341524 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.341546 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:00Z","lastTransitionTime":"2026-02-21T21:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.445513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.445586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.445604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.445629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.445649 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:00Z","lastTransitionTime":"2026-02-21T21:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.549425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.549508 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.549534 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.549559 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.549576 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:00Z","lastTransitionTime":"2026-02-21T21:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.653888 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.653949 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.653968 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.653991 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.654010 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:00Z","lastTransitionTime":"2026-02-21T21:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.757442 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.757526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.757549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.757578 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.757603 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:00Z","lastTransitionTime":"2026-02-21T21:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.861244 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.861322 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.861346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.861455 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.861540 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:00Z","lastTransitionTime":"2026-02-21T21:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.964951 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.965028 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.965053 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.965084 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.965106 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:00Z","lastTransitionTime":"2026-02-21T21:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.975422 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.975424 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:00 crc kubenswrapper[4717]: E0221 21:48:00.975594 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.975437 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:00 crc kubenswrapper[4717]: E0221 21:48:00.975730 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:00 crc kubenswrapper[4717]: E0221 21:48:00.975980 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:00 crc kubenswrapper[4717]: I0221 21:48:00.987167 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 14:01:59.700578495 +0000 UTC Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.068752 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.068823 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.068844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.068915 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.068940 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:01Z","lastTransitionTime":"2026-02-21T21:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.171790 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.171917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.171937 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.171962 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.171980 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:01Z","lastTransitionTime":"2026-02-21T21:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.274977 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.275102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.275122 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.275150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.275171 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:01Z","lastTransitionTime":"2026-02-21T21:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.380490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.380568 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.380591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.380617 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.380635 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:01Z","lastTransitionTime":"2026-02-21T21:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.483900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.483996 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.484023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.484061 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.484087 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:01Z","lastTransitionTime":"2026-02-21T21:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.588320 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.588920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.588945 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.588974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.588997 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:01Z","lastTransitionTime":"2026-02-21T21:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.693005 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.693072 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.693090 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.693113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.693133 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:01Z","lastTransitionTime":"2026-02-21T21:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.796981 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.797045 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.797062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.797087 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.797107 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:01Z","lastTransitionTime":"2026-02-21T21:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.899690 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.899766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.899787 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.899817 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.899838 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:01Z","lastTransitionTime":"2026-02-21T21:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.975629 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:01 crc kubenswrapper[4717]: E0221 21:48:01.975930 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:01 crc kubenswrapper[4717]: I0221 21:48:01.988326 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 09:48:06.255815309 +0000 UTC Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.002766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.002825 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.002845 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.002895 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.002920 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:02Z","lastTransitionTime":"2026-02-21T21:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.106044 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.106117 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.106143 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.106172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.106194 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:02Z","lastTransitionTime":"2026-02-21T21:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.210026 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.210089 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.210109 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.210135 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.210152 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:02Z","lastTransitionTime":"2026-02-21T21:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.313235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.313309 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.313329 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.313358 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.313375 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:02Z","lastTransitionTime":"2026-02-21T21:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.416948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.417010 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.417027 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.417052 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.417071 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:02Z","lastTransitionTime":"2026-02-21T21:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.520493 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.520578 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.520594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.520622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.520638 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:02Z","lastTransitionTime":"2026-02-21T21:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.624106 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.624170 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.624188 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.624215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.624233 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:02Z","lastTransitionTime":"2026-02-21T21:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.728171 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.728256 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.728289 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.728320 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.728347 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:02Z","lastTransitionTime":"2026-02-21T21:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.832551 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.832618 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.832637 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.832661 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.832679 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:02Z","lastTransitionTime":"2026-02-21T21:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.935466 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.935531 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.935549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.935576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.935593 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:02Z","lastTransitionTime":"2026-02-21T21:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.975613 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:02 crc kubenswrapper[4717]: E0221 21:48:02.975809 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.976131 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:02 crc kubenswrapper[4717]: E0221 21:48:02.976264 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.977496 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:02 crc kubenswrapper[4717]: E0221 21:48:02.977610 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:02 crc kubenswrapper[4717]: I0221 21:48:02.989342 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 02:42:04.695618561 +0000 UTC Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.039478 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.039567 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.039594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.039629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.039651 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:03Z","lastTransitionTime":"2026-02-21T21:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.142756 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.142833 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.142906 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.142942 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.142964 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:03Z","lastTransitionTime":"2026-02-21T21:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.246698 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.246770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.246790 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.246819 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.246843 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:03Z","lastTransitionTime":"2026-02-21T21:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.349657 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.349708 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.349722 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.349743 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.349759 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:03Z","lastTransitionTime":"2026-02-21T21:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.454030 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.454120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.454146 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.454184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.454213 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:03Z","lastTransitionTime":"2026-02-21T21:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.558131 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.558250 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.558268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.558292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.558335 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:03Z","lastTransitionTime":"2026-02-21T21:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.661571 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.661715 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.661811 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.661899 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.661924 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:03Z","lastTransitionTime":"2026-02-21T21:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.765268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.765355 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.765373 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.765403 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.765423 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:03Z","lastTransitionTime":"2026-02-21T21:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.868723 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.868842 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.868901 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.868933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.868959 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:03Z","lastTransitionTime":"2026-02-21T21:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.896300 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.896468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.896491 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.896555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.896580 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:03Z","lastTransitionTime":"2026-02-21T21:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:03 crc kubenswrapper[4717]: E0221 21:48:03.923324 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:03Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.929434 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.929490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.929505 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.929523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.929536 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:03Z","lastTransitionTime":"2026-02-21T21:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:03 crc kubenswrapper[4717]: E0221 21:48:03.950166 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:03Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.955362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.955422 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.955442 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.955467 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.955486 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:03Z","lastTransitionTime":"2026-02-21T21:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.975823 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:03 crc kubenswrapper[4717]: E0221 21:48:03.976073 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:03 crc kubenswrapper[4717]: E0221 21:48:03.976158 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:03Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.981636 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.981762 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.981802 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.981843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.983086 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:03Z","lastTransitionTime":"2026-02-21T21:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:03 crc kubenswrapper[4717]: I0221 21:48:03.989947 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 16:00:57.91703223 +0000 UTC Feb 21 21:48:04 crc kubenswrapper[4717]: E0221 21:48:04.006788 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",
\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:04Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.014115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.014191 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.014218 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.014249 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.014269 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:04Z","lastTransitionTime":"2026-02-21T21:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:04 crc kubenswrapper[4717]: E0221 21:48:04.037700 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-21T21:48:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-21T21:48:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18974447-0c58-4cc9-b717-e1c8f74e7687\\\",\\\"systemUUID\\\":\\\"b77a7e0d-ef1a-4d7c-aafe-9153f1f2e1ec\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:04Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:04 crc kubenswrapper[4717]: E0221 21:48:04.037844 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.039967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.040039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.040059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.040085 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.040102 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:04Z","lastTransitionTime":"2026-02-21T21:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.143597 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.143670 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.143689 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.143717 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.143733 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:04Z","lastTransitionTime":"2026-02-21T21:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.246600 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.246664 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.246685 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.246714 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.246732 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:04Z","lastTransitionTime":"2026-02-21T21:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.350050 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.350120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.350141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.350168 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.350186 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:04Z","lastTransitionTime":"2026-02-21T21:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.453465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.453546 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.453570 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.453599 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.453623 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:04Z","lastTransitionTime":"2026-02-21T21:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.556816 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.556907 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.556929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.556953 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.556970 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:04Z","lastTransitionTime":"2026-02-21T21:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.660601 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.660713 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.660733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.660796 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.660816 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:04Z","lastTransitionTime":"2026-02-21T21:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.764101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.764170 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.764189 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.764214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.764236 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:04Z","lastTransitionTime":"2026-02-21T21:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.868069 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.868140 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.868158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.868239 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.868261 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:04Z","lastTransitionTime":"2026-02-21T21:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.972274 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.972356 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.972376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.972409 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.972427 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:04Z","lastTransitionTime":"2026-02-21T21:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.975787 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.975899 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.975956 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:04 crc kubenswrapper[4717]: E0221 21:48:04.976128 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:04 crc kubenswrapper[4717]: E0221 21:48:04.976246 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:04 crc kubenswrapper[4717]: E0221 21:48:04.976608 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:04 crc kubenswrapper[4717]: I0221 21:48:04.990151 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 20:28:16.765163796 +0000 UTC Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.076426 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.076552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.076563 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.076583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.076594 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:05Z","lastTransitionTime":"2026-02-21T21:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.179553 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.179637 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.179659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.179693 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.179717 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:05Z","lastTransitionTime":"2026-02-21T21:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.282933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.282995 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.283014 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.283041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.283062 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:05Z","lastTransitionTime":"2026-02-21T21:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.387130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.387201 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.387220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.387252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.387275 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:05Z","lastTransitionTime":"2026-02-21T21:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.506014 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.506068 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.506087 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.506119 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.506138 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:05Z","lastTransitionTime":"2026-02-21T21:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.609828 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.609968 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.609996 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.610037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.610066 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:05Z","lastTransitionTime":"2026-02-21T21:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.713093 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.713146 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.713165 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.713192 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.713212 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:05Z","lastTransitionTime":"2026-02-21T21:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.817322 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.817362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.817372 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.817391 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.817403 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:05Z","lastTransitionTime":"2026-02-21T21:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.920213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.920262 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.920279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.920307 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.920327 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:05Z","lastTransitionTime":"2026-02-21T21:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.976281 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:05 crc kubenswrapper[4717]: E0221 21:48:05.976498 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.978921 4717 scope.go:117] "RemoveContainer" containerID="d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa" Feb 21 21:48:05 crc kubenswrapper[4717]: E0221 21:48:05.979187 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.991344 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 13:33:32.559501581 +0000 UTC Feb 21 21:48:05 crc kubenswrapper[4717]: I0221 21:48:05.992304 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058926f7-024e-464a-96a7-3e96a96affc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e679e422ecb9d4c9dc0d8e3f6cdf59541344ccc353a0ef795a1b988ff9953f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vm86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:05Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.011642 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8203b79d-1367-43b6-8567-797ec1b0c09b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wz2v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gt2bg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc 
kubenswrapper[4717]: I0221 21:48:06.024550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.024670 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.024695 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.024727 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.024754 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:06Z","lastTransitionTime":"2026-02-21T21:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.033853 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e2cc00d616524f204158146cfb74a033fae9584e58c6e5dffe2a17896d3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.056840 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.074410 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dg4jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abf9538-30e5-4e8e-8084-ecf9eee7e364\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a91b4aa08f59a287b2fdef05706363e37f319375e17555e2582a98a262691d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9c55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dg4jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.100334 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bzd94" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed2cfeb7efa6bfd33e2935b831034faad1ff107bd112b21d7e85dc510ddc227\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:35Z\\\",\\\"message\\\":\\\"2026-02-21T21:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43\\\\n2026-02-21T21:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d10a5d4-0cf8-48c8-8181-befb5c4fbf43 to /host/opt/cni/bin/\\\\n2026-02-21T21:46:50Z [verbose] multus-daemon started\\\\n2026-02-21T21:46:50Z [verbose] Readiness Indicator file check\\\\n2026-02-21T21:47:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hxjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bzd94\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.129244 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.129322 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.129341 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 
21:48:06.129369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.129386 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:06Z","lastTransitionTime":"2026-02-21T21:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.133588 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a10be9-c25d-42c3-9a4f-e2397cc64852\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-21T21:47:38Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0221 21:47:38.954904 6734 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0221 21:47:38.954939 6734 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0221 21:47:38.954987 6734 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0221 
21:47:38.954997 6734 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0221 21:47:38.955021 6734 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0221 21:47:38.955044 6734 handler.go:208] Removed *v1.Node event handler 2\\\\nI0221 21:47:38.955085 6734 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0221 21:47:38.955099 6734 factory.go:656] Stopping watch factory\\\\nI0221 21:47:38.955110 6734 handler.go:208] Removed *v1.Node event handler 7\\\\nI0221 21:47:38.955117 6734 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0221 21:47:38.955131 6734 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0221 21:47:38.955378 6734 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0221 21:47:38.955511 6734 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0221 21:47:38.955580 6734 ovnkube.go:599] Stopped ovnkube\\\\nI0221 21:47:38.955635 6734 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0221 21:47:38.955737 6734 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:47:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7ndm2_openshift-ovn-kubernetes(f6a10be9-c25d-42c3-9a4f-e2397cc64852)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc6d7ad30b8a6f17
4cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78fqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7ndm2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.156108 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f060308-c477-4d3e-86bd-f42465c25807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7dc5784fa1e98fe19739f6cb7c56551edfd42ff37207f3f3ab23d778c7d7fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71d18708144d863bd45eb847e3c6f9db2c45e24ee6d458aea67c4a438e54feaf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://147e77fa5eae73a08eb2b6ecc554b7435fdcf9857b0f62d66d7acbb29927b131\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.176775 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc5eeb62-90d6-4f10-9b58-f147b23eb775\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899466ea9207540bae97b8dc4df8627fcc3cd7e6a7ce83cef9b0a2e9eb50c104\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3
b65222c7b31d74958eb4425b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnjzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-flt22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.194812 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"287b67d9-4a4e-4dcc-9723-70c8ac00c1ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e09931e29112e9ec237d2e9aeaf1ec5b26b05afe456ef9aa30b264fd4e53404c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f4a02f6e4678eba7ab14081252841fc9926
950e9f12fc0ac16a0fec0e26c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:47:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rtmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m58jv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.216470 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8fdcf84-b93a-45e0-aaf0-170c282e61d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-21T21:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0221 21:46:39.830076 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0221 21:46:39.838929 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1885294131/tls.crt::/tmp/serving-cert-1885294131/tls.key\\\\\\\"\\\\nI0221 21:46:45.695574 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0221 21:46:45.700237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0221 21:46:45.700342 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0221 21:46:45.700437 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0221 21:46:45.700494 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0221 21:46:45.711254 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0221 21:46:45.711296 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0221 21:46:45.711421 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0221 21:46:45.711539 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0221 21:46:45.711584 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0221 21:46:45.711631 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0221 21:46:45.711675 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0221 21:46:45.716355 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fb7a6a4a6982ebe8c8eea7945c2a046
5d0d6b50b06bf46dc88fd76dc05e3234\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.233795 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"190c3fdb-77ff-4cde-9a39-6c866a164001\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955da883505373fb420d68cfa68a9cbb458ab8b9d419a5948b02ff33fe41b14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://185793e7e1f802a2b3276fcf193733a50ae99245b3c9dcfab11256e857b70eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://185793e7e1f802a2b3276fcf193733a50ae99245b3c9dcfab11256e857b70eb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.233957 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.234032 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.234052 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.234080 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.234098 4717 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:06Z","lastTransitionTime":"2026-02-21T21:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.256926 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36ef16911e658e4830600a9bf7f528e8f680c6e53343edf0a86b525969d79367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8584c658aaae3523c1db97516cc498b00ff096d9f8c854a47c1eadd2a6397ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.278007 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45647260a0e7465f6d6b8218c8a531bf560b5e299934e7c7569aa9cc2d789318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.302643 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-82vcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d99f8dec-a80d-4890-b903-fe05d6d47d62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abcabcf1899c6fff13392483c47fe06d1ec860f826671a4cf8a695566598f367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a5c975619f6c6f47f6f4ba42ceb679cd496d0ef653517eeb532524039413950\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d99b076f27700b6e109fdc6fd8201896555a686cbb32d027baf76be52a653872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6084e444b900e726c0a4c35c497571b7d647a385646fc99e9123beab016e1e9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://768b7b517f321e8b48fbb23058ea3e129cf2381f19801b622c3ae53c7f0924d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efc3dbf9816d222addda5337235d779201783024af38d7b8de3f4c4fe68907ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://486bdce2531a288e940a1dfd29623e053555c6fa54e40eee02c192f8443f465d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56cgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-82vcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.337165 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.337220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.337238 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.337264 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.337287 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:06Z","lastTransitionTime":"2026-02-21T21:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.339680 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5da8c30-ff59-42a6-8305-27e0f12d730d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9703a4c04285be2b1a9b9a1b681bd354bc2266389c4a64aa7e5bbcfb2d8906f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86446202e0d26ca2c2c3973d810ef949382d1981b62e3319b17fc4ff201378e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd653efc0f8ae7e2a2244598a51c694ea9f209e3419c9ad8ba135fbab546647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc6cf6818d83e685016e62a94546c4e7b5e405af261ee0269db9dc1be501e4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://625bcd2589c9c533c58a35ce056454884d93dccfc75cbc494e6063b8d8fefb39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f730b67d5e86f2248b208a0531cc5838750b086d4aa81eafb38791af7b683d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f730b67d5e86f2248b208a0531cc5838750b086d4aa81eafb38791af7b683d62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7040d4b6e66eec274468a6da614a01d836f977669de37f0735d39b7fb6ad5b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7040d4b6e66eec274468a6da614a01d836f977669de37f0735d39b7fb6ad5b47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e459081a802e9d60a5ee2f1b48cbd144b283561477c74f718bf6dfd680d5b56c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e459081a802e9d60a5ee2f1b48cbd144b283561477c74f718bf6dfd680d5b56c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-21T21:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.362668 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"154bad15-f560-48f9-ae8e-92c12d3ae5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bfed6617bca25f3b3212d9fb2d1ae31778a863f9fa9b1fe3e9eb787dea44f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57c2fa325d2da94225841ab78a9e160e52bace1f0ec2d121b93db85323b55785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04a04b3b7590932005e60eaf68fa97fdeb071b48df00bcdbb0c65deb8fe9da9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-21T21:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://92aa7b324f88e56787f77aef363707fe11ca5003b69900a31a1800b259848a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-21T21:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-21T21:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-21T21:46:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.384999 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.405662 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-21T21:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-21T21:48:06Z is after 2025-08-24T17:21:41Z" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.421420 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs\") pod \"network-metrics-daemon-gt2bg\" (UID: \"8203b79d-1367-43b6-8567-797ec1b0c09b\") " pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:06 crc kubenswrapper[4717]: E0221 21:48:06.421658 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 21:48:06 crc kubenswrapper[4717]: E0221 21:48:06.421770 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs podName:8203b79d-1367-43b6-8567-797ec1b0c09b nodeName:}" failed. 
No retries permitted until 2026-02-21 21:49:10.42173838 +0000 UTC m=+165.203272042 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs") pod "network-metrics-daemon-gt2bg" (UID: "8203b79d-1367-43b6-8567-797ec1b0c09b") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.441065 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.441114 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.441140 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.441170 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.441195 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:06Z","lastTransitionTime":"2026-02-21T21:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.544582 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.544634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.544651 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.544674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.544690 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:06Z","lastTransitionTime":"2026-02-21T21:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.648755 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.649198 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.649345 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.649442 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.649534 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:06Z","lastTransitionTime":"2026-02-21T21:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.752962 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.753309 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.753373 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.753456 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.753536 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:06Z","lastTransitionTime":"2026-02-21T21:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.856324 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.856796 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.857007 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.857225 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.857620 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:06Z","lastTransitionTime":"2026-02-21T21:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.960755 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.960826 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.960844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.960898 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.960919 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:06Z","lastTransitionTime":"2026-02-21T21:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.975751 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.975886 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:06 crc kubenswrapper[4717]: E0221 21:48:06.975960 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.976012 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:06 crc kubenswrapper[4717]: E0221 21:48:06.976187 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:06 crc kubenswrapper[4717]: E0221 21:48:06.976295 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:06 crc kubenswrapper[4717]: I0221 21:48:06.992392 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 13:13:38.464986802 +0000 UTC Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.064427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.064482 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.064499 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.064520 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.064538 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:07Z","lastTransitionTime":"2026-02-21T21:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.168131 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.168214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.168241 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.168273 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.168297 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:07Z","lastTransitionTime":"2026-02-21T21:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.271978 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.272059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.272083 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.272120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.272145 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:07Z","lastTransitionTime":"2026-02-21T21:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.375024 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.375093 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.375112 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.375135 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.375155 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:07Z","lastTransitionTime":"2026-02-21T21:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.478274 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.478345 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.478370 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.478399 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.478420 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:07Z","lastTransitionTime":"2026-02-21T21:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.582369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.582459 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.582485 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.582516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.582539 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:07Z","lastTransitionTime":"2026-02-21T21:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.686133 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.686204 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.686244 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.686278 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.686300 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:07Z","lastTransitionTime":"2026-02-21T21:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.789917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.789980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.789996 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.790019 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.790038 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:07Z","lastTransitionTime":"2026-02-21T21:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.893639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.893734 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.893759 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.893793 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.893822 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:07Z","lastTransitionTime":"2026-02-21T21:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.976447 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:07 crc kubenswrapper[4717]: E0221 21:48:07.976719 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.994034 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 07:28:12.005784082 +0000 UTC Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.996406 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.996454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.996471 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.996494 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:07 crc kubenswrapper[4717]: I0221 21:48:07.996511 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:07Z","lastTransitionTime":"2026-02-21T21:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.100296 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.100373 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.100392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.100419 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.100435 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:08Z","lastTransitionTime":"2026-02-21T21:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.203916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.203995 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.204014 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.204042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.204061 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:08Z","lastTransitionTime":"2026-02-21T21:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.306992 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.307102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.307125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.307164 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.307200 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:08Z","lastTransitionTime":"2026-02-21T21:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.410355 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.410446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.410471 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.410505 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.410529 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:08Z","lastTransitionTime":"2026-02-21T21:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.517310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.517387 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.517410 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.517450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.517470 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:08Z","lastTransitionTime":"2026-02-21T21:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.621058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.621128 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.621139 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.621162 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.621178 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:08Z","lastTransitionTime":"2026-02-21T21:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.724691 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.724767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.724786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.724815 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.724835 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:08Z","lastTransitionTime":"2026-02-21T21:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.828624 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.828701 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.828719 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.828743 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.828761 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:08Z","lastTransitionTime":"2026-02-21T21:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.931966 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.932052 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.932071 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.932101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.932124 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:08Z","lastTransitionTime":"2026-02-21T21:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.975500 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.975637 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.975737 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:08 crc kubenswrapper[4717]: E0221 21:48:08.975936 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:08 crc kubenswrapper[4717]: E0221 21:48:08.976131 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:08 crc kubenswrapper[4717]: E0221 21:48:08.976310 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:08 crc kubenswrapper[4717]: I0221 21:48:08.994971 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 22:58:53.863314835 +0000 UTC Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.035632 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.035719 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.035744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.035779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.035808 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:09Z","lastTransitionTime":"2026-02-21T21:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.139502 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.139590 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.139609 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.139640 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.139664 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:09Z","lastTransitionTime":"2026-02-21T21:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.242444 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.242523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.242545 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.242576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.242597 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:09Z","lastTransitionTime":"2026-02-21T21:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.345628 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.345716 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.345738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.345767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.345790 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:09Z","lastTransitionTime":"2026-02-21T21:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.449242 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.449317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.449334 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.449359 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.449380 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:09Z","lastTransitionTime":"2026-02-21T21:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.552461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.552535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.552556 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.552585 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.552606 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:09Z","lastTransitionTime":"2026-02-21T21:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.655543 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.655606 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.655622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.655645 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.655662 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:09Z","lastTransitionTime":"2026-02-21T21:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.759075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.759144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.759163 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.759191 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.759212 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:09Z","lastTransitionTime":"2026-02-21T21:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.862258 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.862326 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.862345 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.862369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.862387 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:09Z","lastTransitionTime":"2026-02-21T21:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.965331 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.965385 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.965398 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.965419 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.965435 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:09Z","lastTransitionTime":"2026-02-21T21:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.975895 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:09 crc kubenswrapper[4717]: E0221 21:48:09.976496 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:09 crc kubenswrapper[4717]: I0221 21:48:09.995590 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 11:38:27.783655258 +0000 UTC Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.069623 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.069689 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.069703 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.069729 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.069744 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:10Z","lastTransitionTime":"2026-02-21T21:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.173964 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.174043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.174064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.174090 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.174108 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:10Z","lastTransitionTime":"2026-02-21T21:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.277338 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.277402 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.277425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.277457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.277510 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:10Z","lastTransitionTime":"2026-02-21T21:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.380338 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.380402 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.380420 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.380443 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.380461 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:10Z","lastTransitionTime":"2026-02-21T21:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.483985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.484157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.484179 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.484204 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.484224 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:10Z","lastTransitionTime":"2026-02-21T21:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.587434 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.587526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.587544 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.587567 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.587584 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:10Z","lastTransitionTime":"2026-02-21T21:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.690634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.690692 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.690710 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.690732 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.690750 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:10Z","lastTransitionTime":"2026-02-21T21:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.794133 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.794201 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.794218 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.794277 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.794303 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:10Z","lastTransitionTime":"2026-02-21T21:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.898002 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.898092 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.898115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.898151 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.898174 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:10Z","lastTransitionTime":"2026-02-21T21:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.975792 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.975809 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:10 crc kubenswrapper[4717]: E0221 21:48:10.976060 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:10 crc kubenswrapper[4717]: E0221 21:48:10.976181 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.975816 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:10 crc kubenswrapper[4717]: E0221 21:48:10.976328 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:10 crc kubenswrapper[4717]: I0221 21:48:10.995969 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 03:20:22.517876223 +0000 UTC Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.001392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.001459 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.001478 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.001504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.001522 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:11Z","lastTransitionTime":"2026-02-21T21:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.104556 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.104638 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.104657 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.104684 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.104702 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:11Z","lastTransitionTime":"2026-02-21T21:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.208383 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.208454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.208472 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.208498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.208515 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:11Z","lastTransitionTime":"2026-02-21T21:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.312167 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.312312 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.312332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.312357 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.312377 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:11Z","lastTransitionTime":"2026-02-21T21:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.415910 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.415972 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.415990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.416015 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.416033 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:11Z","lastTransitionTime":"2026-02-21T21:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.520016 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.520096 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.520115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.520146 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.520164 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:11Z","lastTransitionTime":"2026-02-21T21:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.624150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.624211 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.624227 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.624252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.624272 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:11Z","lastTransitionTime":"2026-02-21T21:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.728554 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.728623 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.728642 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.728675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.728693 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:11Z","lastTransitionTime":"2026-02-21T21:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.831969 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.832042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.832067 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.832102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.832123 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:11Z","lastTransitionTime":"2026-02-21T21:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.935452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.935587 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.935616 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.935650 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.935674 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:11Z","lastTransitionTime":"2026-02-21T21:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.975927 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:11 crc kubenswrapper[4717]: E0221 21:48:11.976080 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:11 crc kubenswrapper[4717]: I0221 21:48:11.997504 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 07:17:41.841394513 +0000 UTC Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.039392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.039460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.039481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.039507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.039533 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:12Z","lastTransitionTime":"2026-02-21T21:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.143471 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.143562 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.143583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.143607 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.143624 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:12Z","lastTransitionTime":"2026-02-21T21:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.246210 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.246330 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.246351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.246374 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.246390 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:12Z","lastTransitionTime":"2026-02-21T21:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.349826 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.349914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.349933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.349956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.349978 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:12Z","lastTransitionTime":"2026-02-21T21:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.453069 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.453135 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.453152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.453178 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.453224 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:12Z","lastTransitionTime":"2026-02-21T21:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.556331 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.556408 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.556429 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.556461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.556480 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:12Z","lastTransitionTime":"2026-02-21T21:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.660062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.660113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.660128 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.660147 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.660163 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:12Z","lastTransitionTime":"2026-02-21T21:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.763574 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.763647 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.763665 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.763693 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.763712 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:12Z","lastTransitionTime":"2026-02-21T21:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.866386 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.866453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.866475 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.866506 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.866530 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:12Z","lastTransitionTime":"2026-02-21T21:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.969003 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.969099 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.969125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.969176 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.969197 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:12Z","lastTransitionTime":"2026-02-21T21:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.975896 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.975967 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.975915 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:12 crc kubenswrapper[4717]: E0221 21:48:12.976070 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:12 crc kubenswrapper[4717]: E0221 21:48:12.976216 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:12 crc kubenswrapper[4717]: E0221 21:48:12.976313 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:12 crc kubenswrapper[4717]: I0221 21:48:12.998145 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 14:52:49.164348535 +0000 UTC Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.072439 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.072506 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.072526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.072555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.072576 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:13Z","lastTransitionTime":"2026-02-21T21:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.176471 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.176540 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.176558 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.176580 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.176597 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:13Z","lastTransitionTime":"2026-02-21T21:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.279703 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.279785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.279812 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.279851 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.279903 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:13Z","lastTransitionTime":"2026-02-21T21:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.383057 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.383215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.383236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.383262 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.383279 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:13Z","lastTransitionTime":"2026-02-21T21:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.486314 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.486384 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.486402 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.486432 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.486464 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:13Z","lastTransitionTime":"2026-02-21T21:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.589797 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.589907 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.589930 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.589960 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.589982 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:13Z","lastTransitionTime":"2026-02-21T21:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.692809 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.692940 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.692963 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.693001 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.693025 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:13Z","lastTransitionTime":"2026-02-21T21:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.795953 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.796024 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.796048 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.796079 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.796102 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:13Z","lastTransitionTime":"2026-02-21T21:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.900371 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.900497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.900533 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.900566 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.900603 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:13Z","lastTransitionTime":"2026-02-21T21:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.975797 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:13 crc kubenswrapper[4717]: E0221 21:48:13.976045 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:13 crc kubenswrapper[4717]: I0221 21:48:13.998600 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 09:34:57.057550861 +0000 UTC Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.004366 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.004396 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.004407 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.004420 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.004433 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:14Z","lastTransitionTime":"2026-02-21T21:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.107305 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.107360 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.107374 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.107393 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.107406 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:14Z","lastTransitionTime":"2026-02-21T21:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.118785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.118906 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.118928 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.118955 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.118975 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-21T21:48:14Z","lastTransitionTime":"2026-02-21T21:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.183199 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz"] Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.183795 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.188282 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.188827 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.190296 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.190296 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.260705 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-82vcj" podStartSLOduration=87.260664898 podStartE2EDuration="1m27.260664898s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:14.237463971 +0000 UTC m=+109.018997623" watchObservedRunningTime="2026-02-21 21:48:14.260664898 +0000 UTC m=+109.042198540" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.280643 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=88.28061422 podStartE2EDuration="1m28.28061422s" podCreationTimestamp="2026-02-21 21:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:14.260271629 +0000 UTC m=+109.041805291" watchObservedRunningTime="2026-02-21 21:48:14.28061422 +0000 UTC 
m=+109.062147872" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.281190 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=22.281182464 podStartE2EDuration="22.281182464s" podCreationTimestamp="2026-02-21 21:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:14.27933116 +0000 UTC m=+109.060864822" watchObservedRunningTime="2026-02-21 21:48:14.281182464 +0000 UTC m=+109.062716116" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.325136 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0d49edd-bff7-4dbe-844e-edd96255563d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4zwcz\" (UID: \"b0d49edd-bff7-4dbe-844e-edd96255563d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.325178 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b0d49edd-bff7-4dbe-844e-edd96255563d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4zwcz\" (UID: \"b0d49edd-bff7-4dbe-844e-edd96255563d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.325194 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0d49edd-bff7-4dbe-844e-edd96255563d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4zwcz\" (UID: \"b0d49edd-bff7-4dbe-844e-edd96255563d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 
21:48:14.325209 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0d49edd-bff7-4dbe-844e-edd96255563d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4zwcz\" (UID: \"b0d49edd-bff7-4dbe-844e-edd96255563d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.325306 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b0d49edd-bff7-4dbe-844e-edd96255563d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4zwcz\" (UID: \"b0d49edd-bff7-4dbe-844e-edd96255563d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.364550 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=25.364526175 podStartE2EDuration="25.364526175s" podCreationTimestamp="2026-02-21 21:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:14.364466424 +0000 UTC m=+109.146000046" watchObservedRunningTime="2026-02-21 21:48:14.364526175 +0000 UTC m=+109.146059817" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.382511 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=58.382491081 podStartE2EDuration="58.382491081s" podCreationTimestamp="2026-02-21 21:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:14.382276117 +0000 UTC m=+109.163809739" watchObservedRunningTime="2026-02-21 21:48:14.382491081 +0000 UTC 
m=+109.164024713" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.414703 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bzd94" podStartSLOduration=87.414681567 podStartE2EDuration="1m27.414681567s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:14.41432598 +0000 UTC m=+109.195859612" watchObservedRunningTime="2026-02-21 21:48:14.414681567 +0000 UTC m=+109.196215179" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.426307 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b0d49edd-bff7-4dbe-844e-edd96255563d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4zwcz\" (UID: \"b0d49edd-bff7-4dbe-844e-edd96255563d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.426339 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0d49edd-bff7-4dbe-844e-edd96255563d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4zwcz\" (UID: \"b0d49edd-bff7-4dbe-844e-edd96255563d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.426357 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0d49edd-bff7-4dbe-844e-edd96255563d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4zwcz\" (UID: \"b0d49edd-bff7-4dbe-844e-edd96255563d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.426374 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0d49edd-bff7-4dbe-844e-edd96255563d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4zwcz\" (UID: \"b0d49edd-bff7-4dbe-844e-edd96255563d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.426391 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b0d49edd-bff7-4dbe-844e-edd96255563d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4zwcz\" (UID: \"b0d49edd-bff7-4dbe-844e-edd96255563d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.426406 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b0d49edd-bff7-4dbe-844e-edd96255563d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4zwcz\" (UID: \"b0d49edd-bff7-4dbe-844e-edd96255563d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.426449 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b0d49edd-bff7-4dbe-844e-edd96255563d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4zwcz\" (UID: \"b0d49edd-bff7-4dbe-844e-edd96255563d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.427759 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0d49edd-bff7-4dbe-844e-edd96255563d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4zwcz\" (UID: \"b0d49edd-bff7-4dbe-844e-edd96255563d\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.445509 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0d49edd-bff7-4dbe-844e-edd96255563d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4zwcz\" (UID: \"b0d49edd-bff7-4dbe-844e-edd96255563d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.449310 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0d49edd-bff7-4dbe-844e-edd96255563d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4zwcz\" (UID: \"b0d49edd-bff7-4dbe-844e-edd96255563d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.477788 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7l5s2" podStartSLOduration=87.47776299 podStartE2EDuration="1m27.47776299s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:14.464070262 +0000 UTC m=+109.245603934" watchObservedRunningTime="2026-02-21 21:48:14.47776299 +0000 UTC m=+109.259296632" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.512461 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.559556 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dg4jx" podStartSLOduration=87.559538365 podStartE2EDuration="1m27.559538365s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:14.55935605 +0000 UTC m=+109.340889682" watchObservedRunningTime="2026-02-21 21:48:14.559538365 +0000 UTC m=+109.341071987" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.575839 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=85.575811262 podStartE2EDuration="1m25.575811262s" podCreationTimestamp="2026-02-21 21:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:14.575152607 +0000 UTC m=+109.356686239" watchObservedRunningTime="2026-02-21 21:48:14.575811262 +0000 UTC m=+109.357344894" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.588147 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podStartSLOduration=87.588129658 podStartE2EDuration="1m27.588129658s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:14.587314708 +0000 UTC m=+109.368848340" watchObservedRunningTime="2026-02-21 21:48:14.588129658 +0000 UTC m=+109.369663280" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.608665 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m58jv" podStartSLOduration=86.608638843 podStartE2EDuration="1m26.608638843s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:14.607945417 +0000 UTC m=+109.389479099" watchObservedRunningTime="2026-02-21 21:48:14.608638843 +0000 UTC m=+109.390172505" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.975385 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.975467 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.975495 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:14 crc kubenswrapper[4717]: E0221 21:48:14.975642 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:14 crc kubenswrapper[4717]: E0221 21:48:14.975747 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:14 crc kubenswrapper[4717]: E0221 21:48:14.975839 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.984899 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" event={"ID":"b0d49edd-bff7-4dbe-844e-edd96255563d","Type":"ContainerStarted","Data":"ebb5610c23a8494ab9ae774f3664effd88615766dd6c4d93fbce365c6ddf1eac"} Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.984977 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" event={"ID":"b0d49edd-bff7-4dbe-844e-edd96255563d","Type":"ContainerStarted","Data":"1c9e816302444fab86ce3632bee22a6ce034817e4f93cf664d70f30b09554d39"} Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.999067 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 00:57:44.810022778 +0000 UTC Feb 21 21:48:14 crc kubenswrapper[4717]: I0221 21:48:14.999164 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 21 21:48:15 crc kubenswrapper[4717]: I0221 21:48:15.007855 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4zwcz" podStartSLOduration=88.007830015 podStartE2EDuration="1m28.007830015s" podCreationTimestamp="2026-02-21 21:46:47 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:15.006996156 +0000 UTC m=+109.788529808" watchObservedRunningTime="2026-02-21 21:48:15.007830015 +0000 UTC m=+109.789363667" Feb 21 21:48:15 crc kubenswrapper[4717]: I0221 21:48:15.018315 4717 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 21 21:48:15 crc kubenswrapper[4717]: I0221 21:48:15.975379 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:15 crc kubenswrapper[4717]: E0221 21:48:15.977407 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:16 crc kubenswrapper[4717]: I0221 21:48:16.976054 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:16 crc kubenswrapper[4717]: I0221 21:48:16.976104 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:16 crc kubenswrapper[4717]: I0221 21:48:16.976130 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:16 crc kubenswrapper[4717]: E0221 21:48:16.977313 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:16 crc kubenswrapper[4717]: E0221 21:48:16.977458 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:16 crc kubenswrapper[4717]: E0221 21:48:16.977825 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:17 crc kubenswrapper[4717]: I0221 21:48:17.976278 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:17 crc kubenswrapper[4717]: E0221 21:48:17.977414 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:18 crc kubenswrapper[4717]: I0221 21:48:18.976181 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:18 crc kubenswrapper[4717]: I0221 21:48:18.976245 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:18 crc kubenswrapper[4717]: I0221 21:48:18.976296 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:18 crc kubenswrapper[4717]: E0221 21:48:18.976363 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:18 crc kubenswrapper[4717]: E0221 21:48:18.976485 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:18 crc kubenswrapper[4717]: E0221 21:48:18.976608 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:19 crc kubenswrapper[4717]: I0221 21:48:19.975567 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:19 crc kubenswrapper[4717]: E0221 21:48:19.976227 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:19 crc kubenswrapper[4717]: I0221 21:48:19.976684 4717 scope.go:117] "RemoveContainer" containerID="d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa" Feb 21 21:48:20 crc kubenswrapper[4717]: I0221 21:48:20.891676 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gt2bg"] Feb 21 21:48:20 crc kubenswrapper[4717]: I0221 21:48:20.892381 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:20 crc kubenswrapper[4717]: E0221 21:48:20.892496 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:20 crc kubenswrapper[4717]: I0221 21:48:20.976102 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:20 crc kubenswrapper[4717]: I0221 21:48:20.976240 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:20 crc kubenswrapper[4717]: I0221 21:48:20.976131 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:20 crc kubenswrapper[4717]: E0221 21:48:20.976310 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:20 crc kubenswrapper[4717]: E0221 21:48:20.976464 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:20 crc kubenswrapper[4717]: E0221 21:48:20.976541 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:21 crc kubenswrapper[4717]: I0221 21:48:21.010959 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovnkube-controller/3.log" Feb 21 21:48:21 crc kubenswrapper[4717]: I0221 21:48:21.020659 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerStarted","Data":"b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912"} Feb 21 21:48:21 crc kubenswrapper[4717]: I0221 21:48:21.021399 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:48:21 crc kubenswrapper[4717]: I0221 21:48:21.059179 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podStartSLOduration=94.05915129 podStartE2EDuration="1m34.05915129s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:21.056665652 +0000 UTC m=+115.838199364" watchObservedRunningTime="2026-02-21 21:48:21.05915129 +0000 UTC m=+115.840684912" Feb 21 21:48:22 crc kubenswrapper[4717]: I0221 21:48:22.027779 
4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzd94_d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da/kube-multus/1.log" Feb 21 21:48:22 crc kubenswrapper[4717]: I0221 21:48:22.028587 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzd94_d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da/kube-multus/0.log" Feb 21 21:48:22 crc kubenswrapper[4717]: I0221 21:48:22.028687 4717 generic.go:334] "Generic (PLEG): container finished" podID="d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da" containerID="3ed2cfeb7efa6bfd33e2935b831034faad1ff107bd112b21d7e85dc510ddc227" exitCode=1 Feb 21 21:48:22 crc kubenswrapper[4717]: I0221 21:48:22.028810 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bzd94" event={"ID":"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da","Type":"ContainerDied","Data":"3ed2cfeb7efa6bfd33e2935b831034faad1ff107bd112b21d7e85dc510ddc227"} Feb 21 21:48:22 crc kubenswrapper[4717]: I0221 21:48:22.028911 4717 scope.go:117] "RemoveContainer" containerID="938646c16c479f8c37a700f4cc403bcf0463b5cadfcda1b7831b7430ad249994" Feb 21 21:48:22 crc kubenswrapper[4717]: I0221 21:48:22.029914 4717 scope.go:117] "RemoveContainer" containerID="3ed2cfeb7efa6bfd33e2935b831034faad1ff107bd112b21d7e85dc510ddc227" Feb 21 21:48:22 crc kubenswrapper[4717]: E0221 21:48:22.030170 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-bzd94_openshift-multus(d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da)\"" pod="openshift-multus/multus-bzd94" podUID="d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da" Feb 21 21:48:22 crc kubenswrapper[4717]: I0221 21:48:22.976122 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:22 crc kubenswrapper[4717]: I0221 21:48:22.976142 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:22 crc kubenswrapper[4717]: I0221 21:48:22.976188 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:22 crc kubenswrapper[4717]: I0221 21:48:22.976239 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:22 crc kubenswrapper[4717]: E0221 21:48:22.976787 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:22 crc kubenswrapper[4717]: E0221 21:48:22.977041 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:22 crc kubenswrapper[4717]: E0221 21:48:22.977186 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:22 crc kubenswrapper[4717]: E0221 21:48:22.977338 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:23 crc kubenswrapper[4717]: I0221 21:48:23.035402 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzd94_d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da/kube-multus/1.log" Feb 21 21:48:24 crc kubenswrapper[4717]: I0221 21:48:24.975944 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:24 crc kubenswrapper[4717]: I0221 21:48:24.976005 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:24 crc kubenswrapper[4717]: E0221 21:48:24.976160 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:24 crc kubenswrapper[4717]: I0221 21:48:24.976469 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:24 crc kubenswrapper[4717]: E0221 21:48:24.976632 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:24 crc kubenswrapper[4717]: I0221 21:48:24.977549 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:24 crc kubenswrapper[4717]: E0221 21:48:24.977844 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:24 crc kubenswrapper[4717]: E0221 21:48:24.978593 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:25 crc kubenswrapper[4717]: E0221 21:48:25.925603 4717 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 21 21:48:26 crc kubenswrapper[4717]: E0221 21:48:26.907096 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 21 21:48:26 crc kubenswrapper[4717]: I0221 21:48:26.975961 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:26 crc kubenswrapper[4717]: I0221 21:48:26.976125 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:26 crc kubenswrapper[4717]: E0221 21:48:26.976330 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:26 crc kubenswrapper[4717]: E0221 21:48:26.976669 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:26 crc kubenswrapper[4717]: I0221 21:48:26.976936 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:26 crc kubenswrapper[4717]: I0221 21:48:26.977037 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:26 crc kubenswrapper[4717]: E0221 21:48:26.977458 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:26 crc kubenswrapper[4717]: E0221 21:48:26.977637 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:28 crc kubenswrapper[4717]: I0221 21:48:28.976422 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:28 crc kubenswrapper[4717]: E0221 21:48:28.976654 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:28 crc kubenswrapper[4717]: I0221 21:48:28.976771 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:28 crc kubenswrapper[4717]: I0221 21:48:28.976793 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:28 crc kubenswrapper[4717]: E0221 21:48:28.977003 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:28 crc kubenswrapper[4717]: E0221 21:48:28.977069 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:28 crc kubenswrapper[4717]: I0221 21:48:28.977129 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:28 crc kubenswrapper[4717]: E0221 21:48:28.977270 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:30 crc kubenswrapper[4717]: I0221 21:48:30.975315 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:30 crc kubenswrapper[4717]: I0221 21:48:30.975482 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:30 crc kubenswrapper[4717]: I0221 21:48:30.975555 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:30 crc kubenswrapper[4717]: E0221 21:48:30.975507 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:30 crc kubenswrapper[4717]: E0221 21:48:30.975741 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:30 crc kubenswrapper[4717]: E0221 21:48:30.975789 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:30 crc kubenswrapper[4717]: I0221 21:48:30.975348 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:30 crc kubenswrapper[4717]: E0221 21:48:30.976275 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:31 crc kubenswrapper[4717]: E0221 21:48:31.908510 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 21 21:48:32 crc kubenswrapper[4717]: I0221 21:48:32.975968 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:48:32 crc kubenswrapper[4717]: I0221 21:48:32.976129 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:32 crc kubenswrapper[4717]: E0221 21:48:32.976200 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b" Feb 21 21:48:32 crc kubenswrapper[4717]: I0221 21:48:32.975995 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:32 crc kubenswrapper[4717]: E0221 21:48:32.976338 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 21 21:48:32 crc kubenswrapper[4717]: E0221 21:48:32.976561 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 21 21:48:32 crc kubenswrapper[4717]: I0221 21:48:32.977537 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:32 crc kubenswrapper[4717]: I0221 21:48:32.977732 4717 scope.go:117] "RemoveContainer" containerID="3ed2cfeb7efa6bfd33e2935b831034faad1ff107bd112b21d7e85dc510ddc227" Feb 21 21:48:32 crc kubenswrapper[4717]: E0221 21:48:32.978003 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 21 21:48:34 crc kubenswrapper[4717]: I0221 21:48:34.084682 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzd94_d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da/kube-multus/1.log" Feb 21 21:48:34 crc kubenswrapper[4717]: I0221 21:48:34.085261 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bzd94" event={"ID":"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da","Type":"ContainerStarted","Data":"879e836c48c185e9832695059e5cff570b9f6ad5d09395cf0f9f6e7c2a7682b4"} Feb 21 21:48:34 crc kubenswrapper[4717]: I0221 21:48:34.975369 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:34 crc kubenswrapper[4717]: I0221 21:48:34.975437 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:34 crc kubenswrapper[4717]: I0221 21:48:34.975465 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 21 21:48:34 crc kubenswrapper[4717]: E0221 21:48:34.975570 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 21 21:48:34 crc kubenswrapper[4717]: I0221 21:48:34.975603 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg"
Feb 21 21:48:34 crc kubenswrapper[4717]: E0221 21:48:34.975836 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 21 21:48:34 crc kubenswrapper[4717]: E0221 21:48:34.975978 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 21 21:48:34 crc kubenswrapper[4717]: E0221 21:48:34.976095 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gt2bg" podUID="8203b79d-1367-43b6-8567-797ec1b0c09b"
Feb 21 21:48:36 crc kubenswrapper[4717]: I0221 21:48:36.975628 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 21 21:48:36 crc kubenswrapper[4717]: I0221 21:48:36.975686 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 21 21:48:36 crc kubenswrapper[4717]: I0221 21:48:36.975921 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg"
Feb 21 21:48:36 crc kubenswrapper[4717]: I0221 21:48:36.975938 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 21 21:48:36 crc kubenswrapper[4717]: I0221 21:48:36.978465 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 21 21:48:36 crc kubenswrapper[4717]: I0221 21:48:36.979421 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 21 21:48:36 crc kubenswrapper[4717]: I0221 21:48:36.979698 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 21 21:48:36 crc kubenswrapper[4717]: I0221 21:48:36.980520 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 21 21:48:36 crc kubenswrapper[4717]: I0221 21:48:36.980825 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 21 21:48:36 crc kubenswrapper[4717]: I0221 21:48:36.981584 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 21 21:48:40 crc kubenswrapper[4717]: I0221 21:48:40.420922 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2"
Feb 21 21:48:44 crc kubenswrapper[4717]: I0221 21:48:44.927682 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 21 21:48:44 crc kubenswrapper[4717]: I0221 21:48:44.983624 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-84tzn"]
Feb 21 21:48:44 crc kubenswrapper[4717]: I0221 21:48:44.986163 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn"
Feb 21 21:48:44 crc kubenswrapper[4717]: I0221 21:48:44.993157 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ltlhm"]
Feb 21 21:48:44 crc kubenswrapper[4717]: I0221 21:48:44.994112 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk"]
Feb 21 21:48:44 crc kubenswrapper[4717]: I0221 21:48:44.994928 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-phxph"]
Feb 21 21:48:44 crc kubenswrapper[4717]: I0221 21:48:44.995022 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk"
Feb 21 21:48:44 crc kubenswrapper[4717]: I0221 21:48:44.995856 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm"
Feb 21 21:48:44 crc kubenswrapper[4717]: I0221 21:48:44.998827 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn"]
Feb 21 21:48:44 crc kubenswrapper[4717]: I0221 21:48:44.999504 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bxq5v"]
Feb 21 21:48:44 crc kubenswrapper[4717]: I0221 21:48:44.999578 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-phxph"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.000124 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.000289 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.000327 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bxq5v"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.000982 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-8cf52"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.001449 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.001457 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8cf52"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.006362 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-st74w"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.007238 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-g8wzz"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.007558 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.007712 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-g8wzz"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.013136 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.013565 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.022047 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b4752"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.022586 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.022903 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-76zmn"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.023307 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-76zmn"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.023979 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.024104 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.024163 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.024376 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.029530 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.029989 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.030119 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.030568 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.030663 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.031451 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.030677 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.030712 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.030757 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.030802 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.030846 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.030940 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.030967 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.030996 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.031037 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.031066 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.031161 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.031216 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.033961 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.038283 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.038990 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.041971 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.042406 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.042931 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.042958 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.045956 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l7kgh"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.047057 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.050939 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.067902 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.068103 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.069500 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.072821 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wqr6g"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.073025 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.079052 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.079206 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.079251 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.079470 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.079607 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.079730 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.079910 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.080068 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.080202 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.080335 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.080399 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.080540 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.080593 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.080651 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.080749 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.080777 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.080883 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.081121 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.081133 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.082701 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.082894 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.083095 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.090929 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.090945 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.091096 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.091219 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.091311 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.091335 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.091411 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.091427 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.091506 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.091583 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.091669 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.091787 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.091893 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.092715 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.092819 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.092913 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.092994 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.093767 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.094024 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.094139 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.094418 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.094602 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.094624 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.094800 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.094845 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.094943 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.095046 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.095125 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.095139 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.095185 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.095219 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.095327 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.095425 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.095494 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.095535 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.095630 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.095669 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.095831 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.095870 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.095939 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.096016 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.096084 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.096163 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.096239 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.096310 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.096983 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.097074 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.097839 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.099771 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.099965 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.100361 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.100967 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.101544 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-47lf7"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.101970 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.102275 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-47lf7"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.102672 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.104035 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.111039 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.118537 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.119531 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lj6tc"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.119692 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120150 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-lj6tc"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120349 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qm2kf"]
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120378 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/83f9314e-459c-4866-830b-80e171b696dd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-phxph\" (UID: \"83f9314e-459c-4866-830b-80e171b696dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-phxph"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120405 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/520428c8-dcc3-4649-8b61-346774136b38-bound-sa-token\") pod \"ingress-operator-5b745b69d9-st74w\" (UID: \"520428c8-dcc3-4649-8b61-346774136b38\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120438 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c73e02c-cb77-47e2-bf8a-1092ada428d5-audit-dir\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120459 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f888dde-fb26-45b0-a084-f63a3c99bf50-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-z7hgg\" (UID: \"2f888dde-fb26-45b0-a084-f63a3c99bf50\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120477 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57fc07c3-9f4a-4494-8a45-04efba3c358c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dk9hn\" (UID: \"57fc07c3-9f4a-4494-8a45-04efba3c358c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120502 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120532 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dbd91a09-921f-4585-986a-90fd4a111781-default-certificate\") pod \"router-default-5444994796-g8wzz\" (UID: \"dbd91a09-921f-4585-986a-90fd4a111781\") " pod="openshift-ingress/router-default-5444994796-g8wzz"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120551 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxzbm\" (UniqueName: \"kubernetes.io/projected/ffa8e999-61a7-4b74-8d0d-74652418374b-kube-api-access-zxzbm\") pod \"machine-approver-56656f9798-27zzk\" (UID: \"ffa8e999-61a7-4b74-8d0d-74652418374b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120569 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swzsm\" (UniqueName: \"kubernetes.io/projected/bdefa569-260f-4b10-a611-ab781c4fea72-kube-api-access-swzsm\") pod \"openshift-config-operator-7777fb866f-b4752\" (UID: \"bdefa569-260f-4b10-a611-ab781c4fea72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120593 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nrfw\" (UniqueName: \"kubernetes.io/projected/fe4fc787-106a-4c35-ba6c-74a64a69db9f-kube-api-access-2nrfw\") pod \"cluster-image-registry-operator-dc59b4c8b-5tplh\" (UID: \"fe4fc787-106a-4c35-ba6c-74a64a69db9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120616 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/56758c2e-648d-41fe-8758-439f0070d150-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-84tzn\" (UID: \"56758c2e-648d-41fe-8758-439f0070d150\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120641 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kzsv\" (UniqueName: \"kubernetes.io/projected/d2206ca5-861f-4a89-8218-8c8a1264b2d8-kube-api-access-8kzsv\") pod \"machine-config-operator-74547568cd-p6d8p\" (UID: \"d2206ca5-861f-4a89-8218-8c8a1264b2d8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120659 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57fc07c3-9f4a-4494-8a45-04efba3c358c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dk9hn\" (UID: \"57fc07c3-9f4a-4494-8a45-04efba3c358c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120678 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ltlhm\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120694 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120713 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn"
Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120731 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f888dde-fb26-45b0-a084-f63a3c99bf50-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-z7hgg\" (UID:
\"2f888dde-fb26-45b0-a084-f63a3c99bf50\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120750 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gktxq\" (UniqueName: \"kubernetes.io/projected/eb856235-9997-4747-9a24-fca2600a68d9-kube-api-access-gktxq\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nrfv\" (UID: \"eb856235-9997-4747-9a24-fca2600a68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120770 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bdefa569-260f-4b10-a611-ab781c4fea72-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b4752\" (UID: \"bdefa569-260f-4b10-a611-ab781c4fea72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120788 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-audit-policies\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120819 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37a6cccd-b5a0-42bd-b580-3fe07356b864-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c4kcz\" (UID: \"37a6cccd-b5a0-42bd-b580-3fe07356b864\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120841 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/520428c8-dcc3-4649-8b61-346774136b38-metrics-tls\") pod \"ingress-operator-5b745b69d9-st74w\" (UID: \"520428c8-dcc3-4649-8b61-346774136b38\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120874 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0ecef83c-1f5a-4db1-93a9-a18476653e8a-trusted-ca\") pod \"console-operator-58897d9998-bxq5v\" (UID: \"0ecef83c-1f5a-4db1-93a9-a18476653e8a\") " pod="openshift-console-operator/console-operator-58897d9998-bxq5v" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120892 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ffa8e999-61a7-4b74-8d0d-74652418374b-auth-proxy-config\") pod \"machine-approver-56656f9798-27zzk\" (UID: \"ffa8e999-61a7-4b74-8d0d-74652418374b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120915 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86542c4e-7f2b-4933-9d0d-737228524851-serving-cert\") pod \"controller-manager-879f6c89f-ltlhm\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.120939 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.122322 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-c5sr8"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.123249 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.124940 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ecef83c-1f5a-4db1-93a9-a18476653e8a-serving-cert\") pod \"console-operator-58897d9998-bxq5v\" (UID: \"0ecef83c-1f5a-4db1-93a9-a18476653e8a\") " pod="openshift-console-operator/console-operator-58897d9998-bxq5v" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125046 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125072 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbd91a09-921f-4585-986a-90fd4a111781-metrics-certs\") pod \"router-default-5444994796-g8wzz\" (UID: \"dbd91a09-921f-4585-986a-90fd4a111781\") " pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125111 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/56758c2e-648d-41fe-8758-439f0070d150-config\") pod \"machine-api-operator-5694c8668f-84tzn\" (UID: \"56758c2e-648d-41fe-8758-439f0070d150\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125198 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9qk9\" (UniqueName: \"kubernetes.io/projected/83f9314e-459c-4866-830b-80e171b696dd-kube-api-access-j9qk9\") pod \"multus-admission-controller-857f4d67dd-phxph\" (UID: \"83f9314e-459c-4866-830b-80e171b696dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-phxph" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125307 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125357 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-config\") pod \"controller-manager-879f6c89f-ltlhm\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125381 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ffa8e999-61a7-4b74-8d0d-74652418374b-machine-approver-tls\") pod \"machine-approver-56656f9798-27zzk\" (UID: \"ffa8e999-61a7-4b74-8d0d-74652418374b\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125399 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdefa569-260f-4b10-a611-ab781c4fea72-serving-cert\") pod \"openshift-config-operator-7777fb866f-b4752\" (UID: \"bdefa569-260f-4b10-a611-ab781c4fea72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125435 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvjdb\" (UniqueName: \"kubernetes.io/projected/4f01ee61-ced4-422e-8397-6b7f905d8d57-kube-api-access-bvjdb\") pod \"openshift-apiserver-operator-796bbdcf4f-zzd8r\" (UID: \"4f01ee61-ced4-422e-8397-6b7f905d8d57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125474 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a6cccd-b5a0-42bd-b580-3fe07356b864-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c4kcz\" (UID: \"37a6cccd-b5a0-42bd-b580-3fe07356b864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125490 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37a6cccd-b5a0-42bd-b580-3fe07356b864-config\") pod \"kube-apiserver-operator-766d6c64bb-c4kcz\" (UID: \"37a6cccd-b5a0-42bd-b580-3fe07356b864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125526 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb856235-9997-4747-9a24-fca2600a68d9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nrfv\" (UID: \"eb856235-9997-4747-9a24-fca2600a68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125544 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125561 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125595 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f01ee61-ced4-422e-8397-6b7f905d8d57-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zzd8r\" (UID: \"4f01ee61-ced4-422e-8397-6b7f905d8d57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125614 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c999g\" (UniqueName: 
\"kubernetes.io/projected/3f6219e3-5e33-4c13-9c52-de506ba0b6a6-kube-api-access-c999g\") pod \"downloads-7954f5f757-8cf52\" (UID: \"3f6219e3-5e33-4c13-9c52-de506ba0b6a6\") " pod="openshift-console/downloads-7954f5f757-8cf52" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125635 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d2206ca5-861f-4a89-8218-8c8a1264b2d8-images\") pod \"machine-config-operator-74547568cd-p6d8p\" (UID: \"d2206ca5-861f-4a89-8218-8c8a1264b2d8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125651 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecef83c-1f5a-4db1-93a9-a18476653e8a-config\") pod \"console-operator-58897d9998-bxq5v\" (UID: \"0ecef83c-1f5a-4db1-93a9-a18476653e8a\") " pod="openshift-console-operator/console-operator-58897d9998-bxq5v" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125690 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa8e999-61a7-4b74-8d0d-74652418374b-config\") pod \"machine-approver-56656f9798-27zzk\" (UID: \"ffa8e999-61a7-4b74-8d0d-74652418374b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125706 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/520428c8-dcc3-4649-8b61-346774136b38-trusted-ca\") pod \"ingress-operator-5b745b69d9-st74w\" (UID: \"520428c8-dcc3-4649-8b61-346774136b38\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125723 
4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dbd91a09-921f-4585-986a-90fd4a111781-stats-auth\") pod \"router-default-5444994796-g8wzz\" (UID: \"dbd91a09-921f-4585-986a-90fd4a111781\") " pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125786 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb856235-9997-4747-9a24-fca2600a68d9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nrfv\" (UID: \"eb856235-9997-4747-9a24-fca2600a68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125811 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vvvs\" (UniqueName: \"kubernetes.io/projected/5c73e02c-cb77-47e2-bf8a-1092ada428d5-kube-api-access-9vvvs\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125846 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9ggc\" (UniqueName: \"kubernetes.io/projected/dbd91a09-921f-4585-986a-90fd4a111781-kube-api-access-f9ggc\") pod \"router-default-5444994796-g8wzz\" (UID: \"dbd91a09-921f-4585-986a-90fd4a111781\") " pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125895 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d2206ca5-861f-4a89-8218-8c8a1264b2d8-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-p6d8p\" (UID: \"d2206ca5-861f-4a89-8218-8c8a1264b2d8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125917 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125934 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.125972 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7dqj\" (UniqueName: \"kubernetes.io/projected/56758c2e-648d-41fe-8758-439f0070d150-kube-api-access-l7dqj\") pod \"machine-api-operator-5694c8668f-84tzn\" (UID: \"56758c2e-648d-41fe-8758-439f0070d150\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.126006 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.126046 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/56758c2e-648d-41fe-8758-439f0070d150-images\") pod \"machine-api-operator-5694c8668f-84tzn\" (UID: \"56758c2e-648d-41fe-8758-439f0070d150\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.126077 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw7tf\" (UniqueName: \"kubernetes.io/projected/520428c8-dcc3-4649-8b61-346774136b38-kube-api-access-dw7tf\") pod \"ingress-operator-5b745b69d9-st74w\" (UID: \"520428c8-dcc3-4649-8b61-346774136b38\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.126094 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5g8g\" (UniqueName: \"kubernetes.io/projected/0ecef83c-1f5a-4db1-93a9-a18476653e8a-kube-api-access-k5g8g\") pod \"console-operator-58897d9998-bxq5v\" (UID: \"0ecef83c-1f5a-4db1-93a9-a18476653e8a\") " pod="openshift-console-operator/console-operator-58897d9998-bxq5v" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.126134 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.126154 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbd91a09-921f-4585-986a-90fd4a111781-service-ca-bundle\") pod \"router-default-5444994796-g8wzz\" (UID: \"dbd91a09-921f-4585-986a-90fd4a111781\") " pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.126173 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57fc07c3-9f4a-4494-8a45-04efba3c358c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dk9hn\" (UID: \"57fc07c3-9f4a-4494-8a45-04efba3c358c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.126192 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f65fg\" (UniqueName: \"kubernetes.io/projected/86542c4e-7f2b-4933-9d0d-737228524851-kube-api-access-f65fg\") pod \"controller-manager-879f6c89f-ltlhm\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.126232 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe4fc787-106a-4c35-ba6c-74a64a69db9f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5tplh\" (UID: \"fe4fc787-106a-4c35-ba6c-74a64a69db9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.126251 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5xh4\" (UniqueName: \"kubernetes.io/projected/2f888dde-fb26-45b0-a084-f63a3c99bf50-kube-api-access-h5xh4\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-z7hgg\" (UID: \"2f888dde-fb26-45b0-a084-f63a3c99bf50\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.126271 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe4fc787-106a-4c35-ba6c-74a64a69db9f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5tplh\" (UID: \"fe4fc787-106a-4c35-ba6c-74a64a69db9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.126293 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe4fc787-106a-4c35-ba6c-74a64a69db9f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5tplh\" (UID: \"fe4fc787-106a-4c35-ba6c-74a64a69db9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.126316 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2206ca5-861f-4a89-8218-8c8a1264b2d8-proxy-tls\") pod \"machine-config-operator-74547568cd-p6d8p\" (UID: \"d2206ca5-861f-4a89-8218-8c8a1264b2d8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.126362 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f01ee61-ced4-422e-8397-6b7f905d8d57-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zzd8r\" (UID: \"4f01ee61-ced4-422e-8397-6b7f905d8d57\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.126383 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-client-ca\") pod \"controller-manager-879f6c89f-ltlhm\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.133889 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.137048 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.138377 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.139072 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.166447 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.167469 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6sxz"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.169013 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6sxz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.170083 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.175582 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.170560 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.175172 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.177665 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.178171 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-84tzn"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.180240 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.182083 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.185963 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hplrj"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.186653 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9j4nq"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.187167 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.187564 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.187874 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hplrj" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.188077 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9j4nq" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.188641 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.190151 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8q2r4"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.190638 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.191135 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.191582 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.192123 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r68hh"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.193077 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r68hh" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.193586 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.194055 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.194598 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8vn67"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.195087 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8vn67" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.195975 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ltlhm"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.197584 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bxq5v"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.199059 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.200438 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b4752"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.201663 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.202766 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.203014 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-st74w"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.204498 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.205406 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.206450 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.207918 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-76zmn"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.208337 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.209477 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.210781 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.212188 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8cf52"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.213370 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dh4pw"] Feb 21 
21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.214458 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-bsjn6"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.215058 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bsjn6" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.215379 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dh4pw" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.215815 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lj6tc"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.216903 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.218663 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qm2kf"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.219721 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9j4nq"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.221667 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hplrj"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.223219 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wqr6g"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.224794 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dbpvk"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228031 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ingress-canary/ingress-canary-2rv95"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228143 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228199 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/56758c2e-648d-41fe-8758-439f0070d150-images\") pod \"machine-api-operator-5694c8668f-84tzn\" (UID: \"56758c2e-648d-41fe-8758-439f0070d150\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228252 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw7tf\" (UniqueName: \"kubernetes.io/projected/520428c8-dcc3-4649-8b61-346774136b38-kube-api-access-dw7tf\") pod \"ingress-operator-5b745b69d9-st74w\" (UID: \"520428c8-dcc3-4649-8b61-346774136b38\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228275 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5g8g\" (UniqueName: \"kubernetes.io/projected/0ecef83c-1f5a-4db1-93a9-a18476653e8a-kube-api-access-k5g8g\") pod \"console-operator-58897d9998-bxq5v\" (UID: \"0ecef83c-1f5a-4db1-93a9-a18476653e8a\") " pod="openshift-console-operator/console-operator-58897d9998-bxq5v" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228298 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228339 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbd91a09-921f-4585-986a-90fd4a111781-service-ca-bundle\") pod \"router-default-5444994796-g8wzz\" (UID: \"dbd91a09-921f-4585-986a-90fd4a111781\") " pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228359 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f65fg\" (UniqueName: \"kubernetes.io/projected/86542c4e-7f2b-4933-9d0d-737228524851-kube-api-access-f65fg\") pod \"controller-manager-879f6c89f-ltlhm\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228380 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe4fc787-106a-4c35-ba6c-74a64a69db9f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5tplh\" (UID: \"fe4fc787-106a-4c35-ba6c-74a64a69db9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228421 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5xh4\" (UniqueName: \"kubernetes.io/projected/2f888dde-fb26-45b0-a084-f63a3c99bf50-kube-api-access-h5xh4\") pod \"kube-storage-version-migrator-operator-b67b599dd-z7hgg\" (UID: \"2f888dde-fb26-45b0-a084-f63a3c99bf50\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228440 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57fc07c3-9f4a-4494-8a45-04efba3c358c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dk9hn\" (UID: \"57fc07c3-9f4a-4494-8a45-04efba3c358c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228464 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2206ca5-861f-4a89-8218-8c8a1264b2d8-proxy-tls\") pod \"machine-config-operator-74547568cd-p6d8p\" (UID: \"d2206ca5-861f-4a89-8218-8c8a1264b2d8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228503 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f01ee61-ced4-422e-8397-6b7f905d8d57-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zzd8r\" (UID: \"4f01ee61-ced4-422e-8397-6b7f905d8d57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228525 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-client-ca\") pod \"controller-manager-879f6c89f-ltlhm\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228544 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/fe4fc787-106a-4c35-ba6c-74a64a69db9f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5tplh\" (UID: \"fe4fc787-106a-4c35-ba6c-74a64a69db9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228582 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe4fc787-106a-4c35-ba6c-74a64a69db9f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5tplh\" (UID: \"fe4fc787-106a-4c35-ba6c-74a64a69db9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228607 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d105add5-0618-41cf-ae5d-59b739694e4c-serving-cert\") pod \"route-controller-manager-6576b87f9c-qrbz9\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228650 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/83f9314e-459c-4866-830b-80e171b696dd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-phxph\" (UID: \"83f9314e-459c-4866-830b-80e171b696dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-phxph" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228677 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/520428c8-dcc3-4649-8b61-346774136b38-bound-sa-token\") pod \"ingress-operator-5b745b69d9-st74w\" (UID: \"520428c8-dcc3-4649-8b61-346774136b38\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228701 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c73e02c-cb77-47e2-bf8a-1092ada428d5-audit-dir\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228745 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f888dde-fb26-45b0-a084-f63a3c99bf50-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-z7hgg\" (UID: \"2f888dde-fb26-45b0-a084-f63a3c99bf50\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228768 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57fc07c3-9f4a-4494-8a45-04efba3c358c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dk9hn\" (UID: \"57fc07c3-9f4a-4494-8a45-04efba3c358c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228786 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2rv95" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228807 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228830 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dbd91a09-921f-4585-986a-90fd4a111781-default-certificate\") pod \"router-default-5444994796-g8wzz\" (UID: \"dbd91a09-921f-4585-986a-90fd4a111781\") " pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228881 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30-config\") pod \"authentication-operator-69f744f599-qm2kf\" (UID: \"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228903 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30-serving-cert\") pod \"authentication-operator-69f744f599-qm2kf\" (UID: \"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228924 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nrfw\" 
(UniqueName: \"kubernetes.io/projected/fe4fc787-106a-4c35-ba6c-74a64a69db9f-kube-api-access-2nrfw\") pod \"cluster-image-registry-operator-dc59b4c8b-5tplh\" (UID: \"fe4fc787-106a-4c35-ba6c-74a64a69db9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228963 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxzbm\" (UniqueName: \"kubernetes.io/projected/ffa8e999-61a7-4b74-8d0d-74652418374b-kube-api-access-zxzbm\") pod \"machine-approver-56656f9798-27zzk\" (UID: \"ffa8e999-61a7-4b74-8d0d-74652418374b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228986 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swzsm\" (UniqueName: \"kubernetes.io/projected/bdefa569-260f-4b10-a611-ab781c4fea72-kube-api-access-swzsm\") pod \"openshift-config-operator-7777fb866f-b4752\" (UID: \"bdefa569-260f-4b10-a611-ab781c4fea72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229007 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/56758c2e-648d-41fe-8758-439f0070d150-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-84tzn\" (UID: \"56758c2e-648d-41fe-8758-439f0070d150\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229050 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kzsv\" (UniqueName: \"kubernetes.io/projected/d2206ca5-861f-4a89-8218-8c8a1264b2d8-kube-api-access-8kzsv\") pod \"machine-config-operator-74547568cd-p6d8p\" (UID: \"d2206ca5-861f-4a89-8218-8c8a1264b2d8\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229072 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57fc07c3-9f4a-4494-8a45-04efba3c358c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dk9hn\" (UID: \"57fc07c3-9f4a-4494-8a45-04efba3c358c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229094 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ltlhm\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229134 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229153 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229171 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/2f888dde-fb26-45b0-a084-f63a3c99bf50-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-z7hgg\" (UID: \"2f888dde-fb26-45b0-a084-f63a3c99bf50\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229212 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gktxq\" (UniqueName: \"kubernetes.io/projected/eb856235-9997-4747-9a24-fca2600a68d9-kube-api-access-gktxq\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nrfv\" (UID: \"eb856235-9997-4747-9a24-fca2600a68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229232 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37a6cccd-b5a0-42bd-b580-3fe07356b864-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c4kcz\" (UID: \"37a6cccd-b5a0-42bd-b580-3fe07356b864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229253 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/520428c8-dcc3-4649-8b61-346774136b38-metrics-tls\") pod \"ingress-operator-5b745b69d9-st74w\" (UID: \"520428c8-dcc3-4649-8b61-346774136b38\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229293 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bdefa569-260f-4b10-a611-ab781c4fea72-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b4752\" (UID: 
\"bdefa569-260f-4b10-a611-ab781c4fea72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229373 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-audit-policies\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229401 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/290a3ba3-ff12-4724-9cdb-c41dab7b0827-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bb9bf\" (UID: \"290a3ba3-ff12-4724-9cdb-c41dab7b0827\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229406 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229439 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d105add5-0618-41cf-ae5d-59b739694e4c-client-ca\") pod \"route-controller-manager-6576b87f9c-qrbz9\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.228198 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229466 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ffa8e999-61a7-4b74-8d0d-74652418374b-auth-proxy-config\") pod \"machine-approver-56656f9798-27zzk\" (UID: \"ffa8e999-61a7-4b74-8d0d-74652418374b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229486 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0ecef83c-1f5a-4db1-93a9-a18476653e8a-trusted-ca\") pod \"console-operator-58897d9998-bxq5v\" (UID: \"0ecef83c-1f5a-4db1-93a9-a18476653e8a\") " pod="openshift-console-operator/console-operator-58897d9998-bxq5v" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229504 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbd91a09-921f-4585-986a-90fd4a111781-service-ca-bundle\") pod \"router-default-5444994796-g8wzz\" (UID: \"dbd91a09-921f-4585-986a-90fd4a111781\") " pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229526 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86542c4e-7f2b-4933-9d0d-737228524851-serving-cert\") pod \"controller-manager-879f6c89f-ltlhm\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229583 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ecef83c-1f5a-4db1-93a9-a18476653e8a-serving-cert\") pod 
\"console-operator-58897d9998-bxq5v\" (UID: \"0ecef83c-1f5a-4db1-93a9-a18476653e8a\") " pod="openshift-console-operator/console-operator-58897d9998-bxq5v" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229621 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/290a3ba3-ff12-4724-9cdb-c41dab7b0827-proxy-tls\") pod \"machine-config-controller-84d6567774-bb9bf\" (UID: \"290a3ba3-ff12-4724-9cdb-c41dab7b0827\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229653 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chz4k\" (UniqueName: \"kubernetes.io/projected/d105add5-0618-41cf-ae5d-59b739694e4c-kube-api-access-chz4k\") pod \"route-controller-manager-6576b87f9c-qrbz9\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229686 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56758c2e-648d-41fe-8758-439f0070d150-config\") pod \"machine-api-operator-5694c8668f-84tzn\" (UID: \"56758c2e-648d-41fe-8758-439f0070d150\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229740 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9qk9\" (UniqueName: \"kubernetes.io/projected/83f9314e-459c-4866-830b-80e171b696dd-kube-api-access-j9qk9\") pod \"multus-admission-controller-857f4d67dd-phxph\" (UID: \"83f9314e-459c-4866-830b-80e171b696dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-phxph" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229763 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229784 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbd91a09-921f-4585-986a-90fd4a111781-metrics-certs\") pod \"router-default-5444994796-g8wzz\" (UID: \"dbd91a09-921f-4585-986a-90fd4a111781\") " pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229835 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30-service-ca-bundle\") pod \"authentication-operator-69f744f599-qm2kf\" (UID: \"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229921 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229940 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/56758c2e-648d-41fe-8758-439f0070d150-images\") pod \"machine-api-operator-5694c8668f-84tzn\" (UID: 
\"56758c2e-648d-41fe-8758-439f0070d150\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229954 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl7fz\" (UniqueName: \"kubernetes.io/projected/907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30-kube-api-access-hl7fz\") pod \"authentication-operator-69f744f599-qm2kf\" (UID: \"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.229981 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d105add5-0618-41cf-ae5d-59b739694e4c-config\") pod \"route-controller-manager-6576b87f9c-qrbz9\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230048 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ffa8e999-61a7-4b74-8d0d-74652418374b-machine-approver-tls\") pod \"machine-approver-56656f9798-27zzk\" (UID: \"ffa8e999-61a7-4b74-8d0d-74652418374b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230073 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdefa569-260f-4b10-a611-ab781c4fea72-serving-cert\") pod \"openshift-config-operator-7777fb866f-b4752\" (UID: \"bdefa569-260f-4b10-a611-ab781c4fea72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230099 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bvjdb\" (UniqueName: \"kubernetes.io/projected/4f01ee61-ced4-422e-8397-6b7f905d8d57-kube-api-access-bvjdb\") pod \"openshift-apiserver-operator-796bbdcf4f-zzd8r\" (UID: \"4f01ee61-ced4-422e-8397-6b7f905d8d57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230126 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-config\") pod \"controller-manager-879f6c89f-ltlhm\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230152 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qm2kf\" (UID: \"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230176 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcdgv\" (UniqueName: \"kubernetes.io/projected/290a3ba3-ff12-4724-9cdb-c41dab7b0827-kube-api-access-rcdgv\") pod \"machine-config-controller-84d6567774-bb9bf\" (UID: \"290a3ba3-ff12-4724-9cdb-c41dab7b0827\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230219 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb856235-9997-4747-9a24-fca2600a68d9-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-8nrfv\" (UID: \"eb856235-9997-4747-9a24-fca2600a68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230239 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230264 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230282 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f01ee61-ced4-422e-8397-6b7f905d8d57-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zzd8r\" (UID: \"4f01ee61-ced4-422e-8397-6b7f905d8d57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230302 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c999g\" (UniqueName: \"kubernetes.io/projected/3f6219e3-5e33-4c13-9c52-de506ba0b6a6-kube-api-access-c999g\") pod \"downloads-7954f5f757-8cf52\" (UID: \"3f6219e3-5e33-4c13-9c52-de506ba0b6a6\") " pod="openshift-console/downloads-7954f5f757-8cf52" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230322 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a6cccd-b5a0-42bd-b580-3fe07356b864-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c4kcz\" (UID: \"37a6cccd-b5a0-42bd-b580-3fe07356b864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230340 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37a6cccd-b5a0-42bd-b580-3fe07356b864-config\") pod \"kube-apiserver-operator-766d6c64bb-c4kcz\" (UID: \"37a6cccd-b5a0-42bd-b580-3fe07356b864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230363 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d2206ca5-861f-4a89-8218-8c8a1264b2d8-images\") pod \"machine-config-operator-74547568cd-p6d8p\" (UID: \"d2206ca5-861f-4a89-8218-8c8a1264b2d8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230382 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecef83c-1f5a-4db1-93a9-a18476653e8a-config\") pod \"console-operator-58897d9998-bxq5v\" (UID: \"0ecef83c-1f5a-4db1-93a9-a18476653e8a\") " pod="openshift-console-operator/console-operator-58897d9998-bxq5v" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230407 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dbd91a09-921f-4585-986a-90fd4a111781-stats-auth\") pod \"router-default-5444994796-g8wzz\" (UID: \"dbd91a09-921f-4585-986a-90fd4a111781\") " pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 
21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230435 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa8e999-61a7-4b74-8d0d-74652418374b-config\") pod \"machine-approver-56656f9798-27zzk\" (UID: \"ffa8e999-61a7-4b74-8d0d-74652418374b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230457 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/520428c8-dcc3-4649-8b61-346774136b38-trusted-ca\") pod \"ingress-operator-5b745b69d9-st74w\" (UID: \"520428c8-dcc3-4649-8b61-346774136b38\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230480 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb856235-9997-4747-9a24-fca2600a68d9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nrfv\" (UID: \"eb856235-9997-4747-9a24-fca2600a68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230534 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vvvs\" (UniqueName: \"kubernetes.io/projected/5c73e02c-cb77-47e2-bf8a-1092ada428d5-kube-api-access-9vvvs\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230576 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230603 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9ggc\" (UniqueName: \"kubernetes.io/projected/dbd91a09-921f-4585-986a-90fd4a111781-kube-api-access-f9ggc\") pod \"router-default-5444994796-g8wzz\" (UID: \"dbd91a09-921f-4585-986a-90fd4a111781\") " pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230629 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d2206ca5-861f-4a89-8218-8c8a1264b2d8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-p6d8p\" (UID: \"d2206ca5-861f-4a89-8218-8c8a1264b2d8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230655 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230684 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7dqj\" (UniqueName: \"kubernetes.io/projected/56758c2e-648d-41fe-8758-439f0070d150-kube-api-access-l7dqj\") pod \"machine-api-operator-5694c8668f-84tzn\" (UID: \"56758c2e-648d-41fe-8758-439f0070d150\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.230731 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56758c2e-648d-41fe-8758-439f0070d150-config\") pod \"machine-api-operator-5694c8668f-84tzn\" (UID: \"56758c2e-648d-41fe-8758-439f0070d150\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.231490 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.232131 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.232963 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f01ee61-ced4-422e-8397-6b7f905d8d57-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zzd8r\" (UID: \"4f01ee61-ced4-422e-8397-6b7f905d8d57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.233739 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c73e02c-cb77-47e2-bf8a-1092ada428d5-audit-dir\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.235060 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-config\") pod \"controller-manager-879f6c89f-ltlhm\" 
(UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.235425 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.236966 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ecef83c-1f5a-4db1-93a9-a18476653e8a-serving-cert\") pod \"console-operator-58897d9998-bxq5v\" (UID: \"0ecef83c-1f5a-4db1-93a9-a18476653e8a\") " pod="openshift-console-operator/console-operator-58897d9998-bxq5v" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.237242 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57fc07c3-9f4a-4494-8a45-04efba3c358c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dk9hn\" (UID: \"57fc07c3-9f4a-4494-8a45-04efba3c358c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.237549 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ffa8e999-61a7-4b74-8d0d-74652418374b-auth-proxy-config\") pod \"machine-approver-56656f9798-27zzk\" (UID: \"ffa8e999-61a7-4b74-8d0d-74652418374b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.237654 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2206ca5-861f-4a89-8218-8c8a1264b2d8-proxy-tls\") pod \"machine-config-operator-74547568cd-p6d8p\" (UID: \"d2206ca5-861f-4a89-8218-8c8a1264b2d8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p" Feb 21 21:48:45 
crc kubenswrapper[4717]: I0221 21:48:45.238133 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d2206ca5-861f-4a89-8218-8c8a1264b2d8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-p6d8p\" (UID: \"d2206ca5-861f-4a89-8218-8c8a1264b2d8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.238246 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-phxph"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.238485 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86542c4e-7f2b-4933-9d0d-737228524851-serving-cert\") pod \"controller-manager-879f6c89f-ltlhm\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.238661 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-client-ca\") pod \"controller-manager-879f6c89f-ltlhm\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.239078 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe4fc787-106a-4c35-ba6c-74a64a69db9f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5tplh\" (UID: \"fe4fc787-106a-4c35-ba6c-74a64a69db9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.241585 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.242075 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb856235-9997-4747-9a24-fca2600a68d9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nrfv\" (UID: \"eb856235-9997-4747-9a24-fca2600a68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.242148 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/56758c2e-648d-41fe-8758-439f0070d150-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-84tzn\" (UID: \"56758c2e-648d-41fe-8758-439f0070d150\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.242517 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-audit-policies\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.242796 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.242996 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa8e999-61a7-4b74-8d0d-74652418374b-config\") pod \"machine-approver-56656f9798-27zzk\" (UID: \"ffa8e999-61a7-4b74-8d0d-74652418374b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.243188 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ltlhm\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.243263 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecef83c-1f5a-4db1-93a9-a18476653e8a-config\") pod \"console-operator-58897d9998-bxq5v\" (UID: \"0ecef83c-1f5a-4db1-93a9-a18476653e8a\") " pod="openshift-console-operator/console-operator-58897d9998-bxq5v" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.243334 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f888dde-fb26-45b0-a084-f63a3c99bf50-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-z7hgg\" (UID: \"2f888dde-fb26-45b0-a084-f63a3c99bf50\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.243377 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb856235-9997-4747-9a24-fca2600a68d9-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-8nrfv\" (UID: \"eb856235-9997-4747-9a24-fca2600a68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.244560 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0ecef83c-1f5a-4db1-93a9-a18476653e8a-trusted-ca\") pod \"console-operator-58897d9998-bxq5v\" (UID: \"0ecef83c-1f5a-4db1-93a9-a18476653e8a\") " pod="openshift-console-operator/console-operator-58897d9998-bxq5v" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.244580 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57fc07c3-9f4a-4494-8a45-04efba3c358c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dk9hn\" (UID: \"57fc07c3-9f4a-4494-8a45-04efba3c358c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.244563 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.244937 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/520428c8-dcc3-4649-8b61-346774136b38-trusted-ca\") pod \"ingress-operator-5b745b69d9-st74w\" (UID: \"520428c8-dcc3-4649-8b61-346774136b38\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.245355 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/520428c8-dcc3-4649-8b61-346774136b38-metrics-tls\") pod \"ingress-operator-5b745b69d9-st74w\" (UID: \"520428c8-dcc3-4649-8b61-346774136b38\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.245574 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.245646 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d2206ca5-861f-4a89-8218-8c8a1264b2d8-images\") pod \"machine-config-operator-74547568cd-p6d8p\" (UID: \"d2206ca5-861f-4a89-8218-8c8a1264b2d8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.245722 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.246079 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.246171 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bdefa569-260f-4b10-a611-ab781c4fea72-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b4752\" (UID: \"bdefa569-260f-4b10-a611-ab781c4fea72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.246195 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbd91a09-921f-4585-986a-90fd4a111781-metrics-certs\") pod \"router-default-5444994796-g8wzz\" (UID: \"dbd91a09-921f-4585-986a-90fd4a111781\") " pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.246498 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdefa569-260f-4b10-a611-ab781c4fea72-serving-cert\") pod \"openshift-config-operator-7777fb866f-b4752\" (UID: \"bdefa569-260f-4b10-a611-ab781c4fea72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.246954 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ffa8e999-61a7-4b74-8d0d-74652418374b-machine-approver-tls\") pod \"machine-approver-56656f9798-27zzk\" (UID: \"ffa8e999-61a7-4b74-8d0d-74652418374b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.247286 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/83f9314e-459c-4866-830b-80e171b696dd-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-phxph\" (UID: \"83f9314e-459c-4866-830b-80e171b696dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-phxph" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.248026 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6sxz"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.248427 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.248697 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37a6cccd-b5a0-42bd-b580-3fe07356b864-config\") pod \"kube-apiserver-operator-766d6c64bb-c4kcz\" (UID: \"37a6cccd-b5a0-42bd-b580-3fe07356b864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.248989 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe4fc787-106a-4c35-ba6c-74a64a69db9f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5tplh\" (UID: \"fe4fc787-106a-4c35-ba6c-74a64a69db9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.249332 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f888dde-fb26-45b0-a084-f63a3c99bf50-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-z7hgg\" (UID: \"2f888dde-fb26-45b0-a084-f63a3c99bf50\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.250321 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f01ee61-ced4-422e-8397-6b7f905d8d57-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zzd8r\" (UID: \"4f01ee61-ced4-422e-8397-6b7f905d8d57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.250460 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dbd91a09-921f-4585-986a-90fd4a111781-default-certificate\") pod \"router-default-5444994796-g8wzz\" (UID: \"dbd91a09-921f-4585-986a-90fd4a111781\") " pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.252980 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.254498 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-47lf7"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.255412 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dbd91a09-921f-4585-986a-90fd4a111781-stats-auth\") pod \"router-default-5444994796-g8wzz\" (UID: \"dbd91a09-921f-4585-986a-90fd4a111781\") " pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.256140 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.256612 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a6cccd-b5a0-42bd-b580-3fe07356b864-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c4kcz\" (UID: \"37a6cccd-b5a0-42bd-b580-3fe07356b864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.256751 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r68hh"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.257331 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.260766 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.261125 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.262082 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.263509 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.264659 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.265753 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l7kgh"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.268288 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-c5sr8"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.270853 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.271081 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.272614 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8q2r4"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.274553 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.276259 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8vn67"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.279226 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dh4pw"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.280299 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2rv95"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.281452 4717 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.282664 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dbpvk"] Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.289149 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.308566 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.329758 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.331517 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d105add5-0618-41cf-ae5d-59b739694e4c-serving-cert\") pod \"route-controller-manager-6576b87f9c-qrbz9\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.331568 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30-config\") pod \"authentication-operator-69f744f599-qm2kf\" (UID: \"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.331592 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30-serving-cert\") pod \"authentication-operator-69f744f599-qm2kf\" (UID: \"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.331967 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/290a3ba3-ff12-4724-9cdb-c41dab7b0827-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bb9bf\" (UID: \"290a3ba3-ff12-4724-9cdb-c41dab7b0827\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.333173 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/290a3ba3-ff12-4724-9cdb-c41dab7b0827-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bb9bf\" (UID: \"290a3ba3-ff12-4724-9cdb-c41dab7b0827\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.333242 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d105add5-0618-41cf-ae5d-59b739694e4c-client-ca\") pod \"route-controller-manager-6576b87f9c-qrbz9\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.333284 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/290a3ba3-ff12-4724-9cdb-c41dab7b0827-proxy-tls\") pod \"machine-config-controller-84d6567774-bb9bf\" (UID: \"290a3ba3-ff12-4724-9cdb-c41dab7b0827\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.333313 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-chz4k\" (UniqueName: \"kubernetes.io/projected/d105add5-0618-41cf-ae5d-59b739694e4c-kube-api-access-chz4k\") pod \"route-controller-manager-6576b87f9c-qrbz9\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.334232 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30-service-ca-bundle\") pod \"authentication-operator-69f744f599-qm2kf\" (UID: \"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.334297 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl7fz\" (UniqueName: \"kubernetes.io/projected/907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30-kube-api-access-hl7fz\") pod \"authentication-operator-69f744f599-qm2kf\" (UID: \"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.334341 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qm2kf\" (UID: \"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.334371 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d105add5-0618-41cf-ae5d-59b739694e4c-config\") pod \"route-controller-manager-6576b87f9c-qrbz9\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.334414 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcdgv\" (UniqueName: \"kubernetes.io/projected/290a3ba3-ff12-4724-9cdb-c41dab7b0827-kube-api-access-rcdgv\") pod \"machine-config-controller-84d6567774-bb9bf\" (UID: \"290a3ba3-ff12-4724-9cdb-c41dab7b0827\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.334899 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d105add5-0618-41cf-ae5d-59b739694e4c-client-ca\") pod \"route-controller-manager-6576b87f9c-qrbz9\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.335344 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d105add5-0618-41cf-ae5d-59b739694e4c-config\") pod \"route-controller-manager-6576b87f9c-qrbz9\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.336474 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d105add5-0618-41cf-ae5d-59b739694e4c-serving-cert\") pod \"route-controller-manager-6576b87f9c-qrbz9\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.354306 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 21 21:48:45 crc 
kubenswrapper[4717]: I0221 21:48:45.368424 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.388232 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.428398 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.448849 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.469424 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.487831 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.508171 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.516329 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30-service-ca-bundle\") pod \"authentication-operator-69f744f599-qm2kf\" (UID: \"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.528305 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.549340 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.568234 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.574972 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30-serving-cert\") pod \"authentication-operator-69f744f599-qm2kf\" (UID: \"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.588495 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.593681 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30-config\") pod \"authentication-operator-69f744f599-qm2kf\" (UID: \"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.615107 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.626598 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qm2kf\" (UID: \"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.628699 4717 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.648989 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.668698 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.688672 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.733568 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.735103 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.749062 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.768685 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.787986 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.819814 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.829332 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.851935 4717 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.868972 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.878375 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/290a3ba3-ff12-4724-9cdb-c41dab7b0827-proxy-tls\") pod \"machine-config-controller-84d6567774-bb9bf\" (UID: \"290a3ba3-ff12-4724-9cdb-c41dab7b0827\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.890038 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.931911 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.949494 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.969463 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 21 21:48:45 crc kubenswrapper[4717]: I0221 21:48:45.989645 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.009645 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.029163 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.049604 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.069067 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.089305 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.109646 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.129124 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.149562 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.169440 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.190300 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.207040 4717 request.go:700] Waited for 1.018705695s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/secrets?fieldSelector=metadata.name%3Dsigning-key&limit=500&resourceVersion=0 Feb 21 21:48:46 crc 
kubenswrapper[4717]: I0221 21:48:46.209798 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.230233 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.249576 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.269807 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.289711 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.309531 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.329778 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.360017 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.368658 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.390293 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.409836 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.428765 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.449077 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.469115 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.489266 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.509216 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.529193 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.548970 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.570051 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.589496 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.610072 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"etcd-client" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.629338 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.649089 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.668856 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.689656 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.710636 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.728753 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.748578 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.769013 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.788899 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.809952 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.829633 4717 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.849141 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.869974 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.890334 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.937299 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw7tf\" (UniqueName: \"kubernetes.io/projected/520428c8-dcc3-4649-8b61-346774136b38-kube-api-access-dw7tf\") pod \"ingress-operator-5b745b69d9-st74w\" (UID: \"520428c8-dcc3-4649-8b61-346774136b38\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.958230 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5g8g\" (UniqueName: \"kubernetes.io/projected/0ecef83c-1f5a-4db1-93a9-a18476653e8a-kube-api-access-k5g8g\") pod \"console-operator-58897d9998-bxq5v\" (UID: \"0ecef83c-1f5a-4db1-93a9-a18476653e8a\") " pod="openshift-console-operator/console-operator-58897d9998-bxq5v" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.979834 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bxq5v" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.980203 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f65fg\" (UniqueName: \"kubernetes.io/projected/86542c4e-7f2b-4933-9d0d-737228524851-kube-api-access-f65fg\") pod \"controller-manager-879f6c89f-ltlhm\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.990043 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 21 21:48:46 crc kubenswrapper[4717]: I0221 21:48:46.996729 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57fc07c3-9f4a-4494-8a45-04efba3c358c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dk9hn\" (UID: \"57fc07c3-9f4a-4494-8a45-04efba3c358c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.009751 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.029338 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.049604 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.068733 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.090589 4717 reflector.go:368] Caches populated for *v1.Secret 
from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.108974 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.163774 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxzbm\" (UniqueName: \"kubernetes.io/projected/ffa8e999-61a7-4b74-8d0d-74652418374b-kube-api-access-zxzbm\") pod \"machine-approver-56656f9798-27zzk\" (UID: \"ffa8e999-61a7-4b74-8d0d-74652418374b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.178092 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.191934 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7dqj\" (UniqueName: \"kubernetes.io/projected/56758c2e-648d-41fe-8758-439f0070d150-kube-api-access-l7dqj\") pod \"machine-api-operator-5694c8668f-84tzn\" (UID: \"56758c2e-648d-41fe-8758-439f0070d150\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.192643 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gktxq\" (UniqueName: \"kubernetes.io/projected/eb856235-9997-4747-9a24-fca2600a68d9-kube-api-access-gktxq\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nrfv\" (UID: \"eb856235-9997-4747-9a24-fca2600a68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.201504 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.219808 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9qk9\" (UniqueName: \"kubernetes.io/projected/83f9314e-459c-4866-830b-80e171b696dd-kube-api-access-j9qk9\") pod \"multus-admission-controller-857f4d67dd-phxph\" (UID: \"83f9314e-459c-4866-830b-80e171b696dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-phxph"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.227079 4717 request.go:700] Waited for 1.994556792s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.241839 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swzsm\" (UniqueName: \"kubernetes.io/projected/bdefa569-260f-4b10-a611-ab781c4fea72-kube-api-access-swzsm\") pod \"openshift-config-operator-7777fb866f-b4752\" (UID: \"bdefa569-260f-4b10-a611-ab781c4fea72\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.253070 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvjdb\" (UniqueName: \"kubernetes.io/projected/4f01ee61-ced4-422e-8397-6b7f905d8d57-kube-api-access-bvjdb\") pod \"openshift-apiserver-operator-796bbdcf4f-zzd8r\" (UID: \"4f01ee61-ced4-422e-8397-6b7f905d8d57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.266046 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.270234 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe4fc787-106a-4c35-ba6c-74a64a69db9f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5tplh\" (UID: \"fe4fc787-106a-4c35-ba6c-74a64a69db9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.281247 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bxq5v"]
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.289393 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37a6cccd-b5a0-42bd-b580-3fe07356b864-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c4kcz\" (UID: \"37a6cccd-b5a0-42bd-b580-3fe07356b864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.289768 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.309050 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9ggc\" (UniqueName: \"kubernetes.io/projected/dbd91a09-921f-4585-986a-90fd4a111781-kube-api-access-f9ggc\") pod \"router-default-5444994796-g8wzz\" (UID: \"dbd91a09-921f-4585-986a-90fd4a111781\") " pod="openshift-ingress/router-default-5444994796-g8wzz"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.330486 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5xh4\" (UniqueName: \"kubernetes.io/projected/2f888dde-fb26-45b0-a084-f63a3c99bf50-kube-api-access-h5xh4\") pod \"kube-storage-version-migrator-operator-b67b599dd-z7hgg\" (UID: \"2f888dde-fb26-45b0-a084-f63a3c99bf50\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.333953 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.350442 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/520428c8-dcc3-4649-8b61-346774136b38-bound-sa-token\") pod \"ingress-operator-5b745b69d9-st74w\" (UID: \"520428c8-dcc3-4649-8b61-346774136b38\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.368962 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c999g\" (UniqueName: \"kubernetes.io/projected/3f6219e3-5e33-4c13-9c52-de506ba0b6a6-kube-api-access-c999g\") pod \"downloads-7954f5f757-8cf52\" (UID: \"3f6219e3-5e33-4c13-9c52-de506ba0b6a6\") " pod="openshift-console/downloads-7954f5f757-8cf52"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.387543 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kzsv\" (UniqueName: \"kubernetes.io/projected/d2206ca5-861f-4a89-8218-8c8a1264b2d8-kube-api-access-8kzsv\") pod \"machine-config-operator-74547568cd-p6d8p\" (UID: \"d2206ca5-861f-4a89-8218-8c8a1264b2d8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.393337 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.399441 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.408375 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vvvs\" (UniqueName: \"kubernetes.io/projected/5c73e02c-cb77-47e2-bf8a-1092ada428d5-kube-api-access-9vvvs\") pod \"oauth-openshift-558db77b4-76zmn\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " pod="openshift-authentication/oauth-openshift-558db77b4-76zmn"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.413463 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.420313 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.432260 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.436502 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nrfw\" (UniqueName: \"kubernetes.io/projected/fe4fc787-106a-4c35-ba6c-74a64a69db9f-kube-api-access-2nrfw\") pod \"cluster-image-registry-operator-dc59b4c8b-5tplh\" (UID: \"fe4fc787-106a-4c35-ba6c-74a64a69db9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.437483 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ltlhm"]
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.453107 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chz4k\" (UniqueName: \"kubernetes.io/projected/d105add5-0618-41cf-ae5d-59b739694e4c-kube-api-access-chz4k\") pod \"route-controller-manager-6576b87f9c-qrbz9\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.466471 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl7fz\" (UniqueName: \"kubernetes.io/projected/907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30-kube-api-access-hl7fz\") pod \"authentication-operator-69f744f599-qm2kf\" (UID: \"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.476565 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.488205 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcdgv\" (UniqueName: \"kubernetes.io/projected/290a3ba3-ff12-4724-9cdb-c41dab7b0827-kube-api-access-rcdgv\") pod \"machine-config-controller-84d6567774-bb9bf\" (UID: \"290a3ba3-ff12-4724-9cdb-c41dab7b0827\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.495125 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.516493 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-phxph"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575095 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-config\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575139 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72d1d285-af56-462a-80e0-985f8b689b10-serving-cert\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575157 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-278b8\" (UniqueName: \"kubernetes.io/projected/17bb07e6-67dd-4cc5-b979-9ef794228e81-kube-api-access-278b8\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575192 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jllh4\" (UniqueName: \"kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-kube-api-access-jllh4\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575236 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-bound-sa-token\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575258 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72ebb725-29ae-4902-9b6b-6258039bb6c0-registry-certificates\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575275 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-oauth-serving-cert\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575292 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/72d1d285-af56-462a-80e0-985f8b689b10-encryption-config\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575345 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/72d1d285-af56-462a-80e0-985f8b689b10-audit-dir\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575395 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ltrd\" (UniqueName: \"kubernetes.io/projected/c6f53bb9-51b7-4a4b-b348-da4598ceccbc-kube-api-access-4ltrd\") pod \"dns-operator-744455d44c-lj6tc\" (UID: \"c6f53bb9-51b7-4a4b-b348-da4598ceccbc\") " pod="openshift-dns-operator/dns-operator-744455d44c-lj6tc"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575414 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4fde7d3-808e-437a-a4b1-a3f44570ba55-config\") pod \"kube-controller-manager-operator-78b949d7b-pq4lv\" (UID: \"d4fde7d3-808e-437a-a4b1-a3f44570ba55\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575453 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6521b257-6f92-42e5-be73-72c45ecfc58a-config\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575469 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72ebb725-29ae-4902-9b6b-6258039bb6c0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575489 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/72d1d285-af56-462a-80e0-985f8b689b10-etcd-client\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575528 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72d1d285-af56-462a-80e0-985f8b689b10-trusted-ca-bundle\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575555 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-oauth-config\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575572 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glv2m\" (UniqueName: \"kubernetes.io/projected/6521b257-6f92-42e5-be73-72c45ecfc58a-kube-api-access-glv2m\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575789 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6521b257-6f92-42e5-be73-72c45ecfc58a-etcd-service-ca\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575820 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znpk6\" (UniqueName: \"kubernetes.io/projected/72d1d285-af56-462a-80e0-985f8b689b10-kube-api-access-znpk6\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575841 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6521b257-6f92-42e5-be73-72c45ecfc58a-serving-cert\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575894 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-serving-cert\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575946 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6521b257-6f92-42e5-be73-72c45ecfc58a-etcd-client\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.575994 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-registry-tls\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.576054 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/72d1d285-af56-462a-80e0-985f8b689b10-etcd-serving-ca\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.576077 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-service-ca\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.576097 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-trusted-ca-bundle\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.576115 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/72d1d285-af56-462a-80e0-985f8b689b10-audit\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.576133 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4fde7d3-808e-437a-a4b1-a3f44570ba55-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-pq4lv\" (UID: \"d4fde7d3-808e-437a-a4b1-a3f44570ba55\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.576150 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72ebb725-29ae-4902-9b6b-6258039bb6c0-trusted-ca\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.576166 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/72d1d285-af56-462a-80e0-985f8b689b10-image-import-ca\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.576234 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4fde7d3-808e-437a-a4b1-a3f44570ba55-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-pq4lv\" (UID: \"d4fde7d3-808e-437a-a4b1-a3f44570ba55\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.576253 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/72d1d285-af56-462a-80e0-985f8b689b10-node-pullsecrets\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.576270 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72d1d285-af56-462a-80e0-985f8b689b10-config\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.576288 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72ebb725-29ae-4902-9b6b-6258039bb6c0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.576366 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6f53bb9-51b7-4a4b-b348-da4598ceccbc-metrics-tls\") pod \"dns-operator-744455d44c-lj6tc\" (UID: \"c6f53bb9-51b7-4a4b-b348-da4598ceccbc\") " pod="openshift-dns-operator/dns-operator-744455d44c-lj6tc"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.576400 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.576427 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6521b257-6f92-42e5-be73-72c45ecfc58a-etcd-ca\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh"
Feb 21 21:48:47 crc kubenswrapper[4717]: E0221 21:48:47.582488 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:48.082467172 +0000 UTC m=+142.864000794 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.590910 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r"]
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.604335 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8cf52"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.608375 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-g8wzz"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.621422 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-76zmn"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.624783 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.678098 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.678373 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk5w9\" (UniqueName: \"kubernetes.io/projected/ff47caaa-cf37-40ea-8c6c-457189a5432b-kube-api-access-jk5w9\") pod \"control-plane-machine-set-operator-78cbb6b69f-r68hh\" (UID: \"ff47caaa-cf37-40ea-8c6c-457189a5432b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r68hh"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.678399 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e172f7a2-4889-47d8-b0b1-a04114d0e328-certs\") pod \"machine-config-server-bsjn6\" (UID: \"e172f7a2-4889-47d8-b0b1-a04114d0e328\") " pod="openshift-machine-config-operator/machine-config-server-bsjn6"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.678417 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c6a85015-ccec-4075-8749-fc07bbea7344-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.678434 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-plugins-dir\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.678449 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07887dc7-0606-470d-9c98-fa209d695e60-config-volume\") pod \"dns-default-dh4pw\" (UID: \"07887dc7-0606-470d-9c98-fa209d695e60\") " pod="openshift-dns/dns-default-dh4pw"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.678467 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/450eb855-2d6d-4503-9a7e-1980d8e97346-secret-volume\") pod \"collect-profiles-29528505-5wtqv\" (UID: \"450eb855-2d6d-4503-9a7e-1980d8e97346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.678500 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-serving-cert\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.678516 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28mzl\" (UniqueName: \"kubernetes.io/projected/e561b490-ea66-4906-9f8b-299a7e5909cf-kube-api-access-28mzl\") pod \"ingress-canary-2rv95\" (UID: \"e561b490-ea66-4906-9f8b-299a7e5909cf\") " pod="openshift-ingress-canary/ingress-canary-2rv95"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.678533 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2e4f85de-f15f-40b8-9bfe-914862b6c20e-srv-cert\") pod \"olm-operator-6b444d44fb-zkqdj\" (UID: \"2e4f85de-f15f-40b8-9bfe-914862b6c20e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.678726 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-mountpoint-dir\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.678917 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-socket-dir\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk"
Feb 21 21:48:47 crc kubenswrapper[4717]: E0221 21:48:47.679085 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:48.179059411 +0000 UTC m=+142.960593033 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.679212 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6521b257-6f92-42e5-be73-72c45ecfc58a-etcd-client\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.679243 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff47caaa-cf37-40ea-8c6c-457189a5432b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r68hh\" (UID: \"ff47caaa-cf37-40ea-8c6c-457189a5432b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r68hh"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.679290 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e172f7a2-4889-47d8-b0b1-a04114d0e328-node-bootstrap-token\") pod \"machine-config-server-bsjn6\" (UID: \"e172f7a2-4889-47d8-b0b1-a04114d0e328\") " pod="openshift-machine-config-operator/machine-config-server-bsjn6"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.679392 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6h28\" (UniqueName: \"kubernetes.io/projected/60d25c82-47d6-4706-8235-70fd592a984d-kube-api-access-p6h28\") pod \"marketplace-operator-79b997595-8q2r4\" (UID: \"60d25c82-47d6-4706-8235-70fd592a984d\") " pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.680206 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-registry-tls\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.680289 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2e4f85de-f15f-40b8-9bfe-914862b6c20e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zkqdj\" (UID: \"2e4f85de-f15f-40b8-9bfe-914862b6c20e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.680456 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/882d3e28-3823-42b6-8973-cd07043ad24d-config\") pod \"service-ca-operator-777779d784-8vn67\" (UID: \"882d3e28-3823-42b6-8973-cd07043ad24d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8vn67"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.680659 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-service-ca\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.680683 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/72d1d285-af56-462a-80e0-985f8b689b10-etcd-serving-ca\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.680730 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-trusted-ca-bundle\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.680748 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/72d1d285-af56-462a-80e0-985f8b689b10-audit\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.680828 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4fde7d3-808e-437a-a4b1-a3f44570ba55-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-pq4lv\" (UID: \"d4fde7d3-808e-437a-a4b1-a3f44570ba55\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.680905 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/395e9ebb-c913-43df-bab9-f4e181167728-profile-collector-cert\") pod \"catalog-operator-68c6474976-ds4d5\" (UID: \"395e9ebb-c913-43df-bab9-f4e181167728\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.680940 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72ebb725-29ae-4902-9b6b-6258039bb6c0-trusted-ca\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.680961 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/72d1d285-af56-462a-80e0-985f8b689b10-image-import-ca\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.680988 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/450eb855-2d6d-4503-9a7e-1980d8e97346-config-volume\") pod \"collect-profiles-29528505-5wtqv\" (UID: \"450eb855-2d6d-4503-9a7e-1980d8e97346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681037 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-registration-dir\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk"
Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681060 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName:
\"kubernetes.io/projected/d4fde7d3-808e-437a-a4b1-a3f44570ba55-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-pq4lv\" (UID: \"d4fde7d3-808e-437a-a4b1-a3f44570ba55\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681082 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzf42\" (UniqueName: \"kubernetes.io/projected/2e4f85de-f15f-40b8-9bfe-914862b6c20e-kube-api-access-jzf42\") pod \"olm-operator-6b444d44fb-zkqdj\" (UID: \"2e4f85de-f15f-40b8-9bfe-914862b6c20e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681102 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/72d1d285-af56-462a-80e0-985f8b689b10-node-pullsecrets\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681120 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72d1d285-af56-462a-80e0-985f8b689b10-config\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681164 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72ebb725-29ae-4902-9b6b-6258039bb6c0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681206 
4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6f53bb9-51b7-4a4b-b348-da4598ceccbc-metrics-tls\") pod \"dns-operator-744455d44c-lj6tc\" (UID: \"c6f53bb9-51b7-4a4b-b348-da4598ceccbc\") " pod="openshift-dns-operator/dns-operator-744455d44c-lj6tc" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681228 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6a85015-ccec-4075-8749-fc07bbea7344-audit-policies\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681251 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/369bb866-95f1-42db-8e35-9e89b6f0d157-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p6sxz\" (UID: \"369bb866-95f1-42db-8e35-9e89b6f0d157\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6sxz" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681299 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681320 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9315435-1144-473c-9d20-8a0ea85d1199-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-6p5kv\" (UID: \"a9315435-1144-473c-9d20-8a0ea85d1199\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681348 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6521b257-6f92-42e5-be73-72c45ecfc58a-etcd-ca\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681369 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-config\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681387 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6dcf9089-19f0-421c-83bc-f5ad21bb07a5-webhook-cert\") pod \"packageserver-d55dfcdfc-v5mbd\" (UID: \"6dcf9089-19f0-421c-83bc-f5ad21bb07a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681407 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/395e9ebb-c913-43df-bab9-f4e181167728-srv-cert\") pod \"catalog-operator-68c6474976-ds4d5\" (UID: \"395e9ebb-c913-43df-bab9-f4e181167728\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681424 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/72d1d285-af56-462a-80e0-985f8b689b10-serving-cert\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681441 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjwlg\" (UniqueName: \"kubernetes.io/projected/c6a85015-ccec-4075-8749-fc07bbea7344-kube-api-access-kjwlg\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681457 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5zmr\" (UniqueName: \"kubernetes.io/projected/07887dc7-0606-470d-9c98-fa209d695e60-kube-api-access-b5zmr\") pod \"dns-default-dh4pw\" (UID: \"07887dc7-0606-470d-9c98-fa209d695e60\") " pod="openshift-dns/dns-default-dh4pw" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681476 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-278b8\" (UniqueName: \"kubernetes.io/projected/17bb07e6-67dd-4cc5-b979-9ef794228e81-kube-api-access-278b8\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681510 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6dcf9089-19f0-421c-83bc-f5ad21bb07a5-apiservice-cert\") pod \"packageserver-d55dfcdfc-v5mbd\" (UID: \"6dcf9089-19f0-421c-83bc-f5ad21bb07a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681533 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jllh4\" (UniqueName: \"kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-kube-api-access-jllh4\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681550 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj29z\" (UniqueName: \"kubernetes.io/projected/a9315435-1144-473c-9d20-8a0ea85d1199-kube-api-access-bj29z\") pod \"package-server-manager-789f6589d5-6p5kv\" (UID: \"a9315435-1144-473c-9d20-8a0ea85d1199\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681568 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-csi-data-dir\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681585 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slhj8\" (UniqueName: \"kubernetes.io/projected/6dcf9089-19f0-421c-83bc-f5ad21bb07a5-kube-api-access-slhj8\") pod \"packageserver-d55dfcdfc-v5mbd\" (UID: \"6dcf9089-19f0-421c-83bc-f5ad21bb07a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681604 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llh9n\" (UniqueName: \"kubernetes.io/projected/e172f7a2-4889-47d8-b0b1-a04114d0e328-kube-api-access-llh9n\") pod \"machine-config-server-bsjn6\" (UID: 
\"e172f7a2-4889-47d8-b0b1-a04114d0e328\") " pod="openshift-machine-config-operator/machine-config-server-bsjn6" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681632 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c6a85015-ccec-4075-8749-fc07bbea7344-encryption-config\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681676 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-bound-sa-token\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681709 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07887dc7-0606-470d-9c98-fa209d695e60-metrics-tls\") pod \"dns-default-dh4pw\" (UID: \"07887dc7-0606-470d-9c98-fa209d695e60\") " pod="openshift-dns/dns-default-dh4pw" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681727 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5778v\" (UniqueName: \"kubernetes.io/projected/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-kube-api-access-5778v\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681754 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/72ebb725-29ae-4902-9b6b-6258039bb6c0-registry-certificates\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681771 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-oauth-serving-cert\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681792 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/87db50f7-cf1f-4f32-b6f4-4f641261c731-signing-key\") pod \"service-ca-9c57cc56f-hplrj\" (UID: \"87db50f7-cf1f-4f32-b6f4-4f641261c731\") " pod="openshift-service-ca/service-ca-9c57cc56f-hplrj" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681811 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6a85015-ccec-4075-8749-fc07bbea7344-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681848 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/72d1d285-af56-462a-80e0-985f8b689b10-encryption-config\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681954 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p77t\" (UniqueName: \"kubernetes.io/projected/882d3e28-3823-42b6-8973-cd07043ad24d-kube-api-access-9p77t\") pod \"service-ca-operator-777779d784-8vn67\" (UID: \"882d3e28-3823-42b6-8973-cd07043ad24d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8vn67" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681970 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7nc5\" (UniqueName: \"kubernetes.io/projected/5d75838b-1762-4f29-89be-817b97eb22f0-kube-api-access-m7nc5\") pod \"migrator-59844c95c7-9j4nq\" (UID: \"5d75838b-1762-4f29-89be-817b97eb22f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9j4nq" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.681997 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/72d1d285-af56-462a-80e0-985f8b689b10-audit-dir\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682013 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c6a85015-ccec-4075-8749-fc07bbea7344-etcd-client\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682032 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ltrd\" (UniqueName: \"kubernetes.io/projected/c6f53bb9-51b7-4a4b-b348-da4598ceccbc-kube-api-access-4ltrd\") pod \"dns-operator-744455d44c-lj6tc\" (UID: \"c6f53bb9-51b7-4a4b-b348-da4598ceccbc\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-lj6tc" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682051 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4fde7d3-808e-437a-a4b1-a3f44570ba55-config\") pod \"kube-controller-manager-operator-78b949d7b-pq4lv\" (UID: \"d4fde7d3-808e-437a-a4b1-a3f44570ba55\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682091 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e561b490-ea66-4906-9f8b-299a7e5909cf-cert\") pod \"ingress-canary-2rv95\" (UID: \"e561b490-ea66-4906-9f8b-299a7e5909cf\") " pod="openshift-ingress-canary/ingress-canary-2rv95" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682141 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6521b257-6f92-42e5-be73-72c45ecfc58a-config\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682157 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6dcf9089-19f0-421c-83bc-f5ad21bb07a5-tmpfs\") pod \"packageserver-d55dfcdfc-v5mbd\" (UID: \"6dcf9089-19f0-421c-83bc-f5ad21bb07a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682177 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72ebb725-29ae-4902-9b6b-6258039bb6c0-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682198 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/72d1d285-af56-462a-80e0-985f8b689b10-etcd-client\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682214 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phcjn\" (UniqueName: \"kubernetes.io/projected/450eb855-2d6d-4503-9a7e-1980d8e97346-kube-api-access-phcjn\") pod \"collect-profiles-29528505-5wtqv\" (UID: \"450eb855-2d6d-4503-9a7e-1980d8e97346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682232 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/882d3e28-3823-42b6-8973-cd07043ad24d-serving-cert\") pod \"service-ca-operator-777779d784-8vn67\" (UID: \"882d3e28-3823-42b6-8973-cd07043ad24d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8vn67" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682248 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/87db50f7-cf1f-4f32-b6f4-4f641261c731-signing-cabundle\") pod \"service-ca-9c57cc56f-hplrj\" (UID: \"87db50f7-cf1f-4f32-b6f4-4f641261c731\") " pod="openshift-service-ca/service-ca-9c57cc56f-hplrj" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682265 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72d1d285-af56-462a-80e0-985f8b689b10-trusted-ca-bundle\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682284 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-oauth-config\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682299 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf9md\" (UniqueName: \"kubernetes.io/projected/87db50f7-cf1f-4f32-b6f4-4f641261c731-kube-api-access-nf9md\") pod \"service-ca-9c57cc56f-hplrj\" (UID: \"87db50f7-cf1f-4f32-b6f4-4f641261c731\") " pod="openshift-service-ca/service-ca-9c57cc56f-hplrj" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682331 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glv2m\" (UniqueName: \"kubernetes.io/projected/6521b257-6f92-42e5-be73-72c45ecfc58a-kube-api-access-glv2m\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682353 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbchv\" (UniqueName: \"kubernetes.io/projected/395e9ebb-c913-43df-bab9-f4e181167728-kube-api-access-nbchv\") pod \"catalog-operator-68c6474976-ds4d5\" (UID: \"395e9ebb-c913-43df-bab9-f4e181167728\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5" Feb 21 21:48:47 crc 
kubenswrapper[4717]: I0221 21:48:47.682369 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60d25c82-47d6-4706-8235-70fd592a984d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8q2r4\" (UID: \"60d25c82-47d6-4706-8235-70fd592a984d\") " pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682389 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/60d25c82-47d6-4706-8235-70fd592a984d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8q2r4\" (UID: \"60d25c82-47d6-4706-8235-70fd592a984d\") " pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682428 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6a85015-ccec-4075-8749-fc07bbea7344-serving-cert\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682446 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6a85015-ccec-4075-8749-fc07bbea7344-audit-dir\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682469 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6521b257-6f92-42e5-be73-72c45ecfc58a-etcd-service-ca\") pod 
\"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682485 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znpk6\" (UniqueName: \"kubernetes.io/projected/72d1d285-af56-462a-80e0-985f8b689b10-kube-api-access-znpk6\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682514 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6521b257-6f92-42e5-be73-72c45ecfc58a-serving-cert\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.682530 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmcvc\" (UniqueName: \"kubernetes.io/projected/369bb866-95f1-42db-8e35-9e89b6f0d157-kube-api-access-nmcvc\") pod \"cluster-samples-operator-665b6dd947-p6sxz\" (UID: \"369bb866-95f1-42db-8e35-9e89b6f0d157\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6sxz" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.684795 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6521b257-6f92-42e5-be73-72c45ecfc58a-etcd-client\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.686686 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/72d1d285-af56-462a-80e0-985f8b689b10-etcd-serving-ca\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.705389 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-registry-tls\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.706339 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-service-ca\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.708243 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-serving-cert\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.708472 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/72d1d285-af56-462a-80e0-985f8b689b10-image-import-ca\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.708920 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/72d1d285-af56-462a-80e0-985f8b689b10-node-pullsecrets\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.709438 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-oauth-serving-cert\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.709555 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72d1d285-af56-462a-80e0-985f8b689b10-config\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.709822 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72ebb725-29ae-4902-9b6b-6258039bb6c0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.710980 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6521b257-6f92-42e5-be73-72c45ecfc58a-etcd-ca\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh" Feb 21 21:48:47 crc kubenswrapper[4717]: E0221 21:48:47.711380 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-02-21 21:48:48.211354046 +0000 UTC m=+142.992887658 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.712138 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-config\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.713037 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72ebb725-29ae-4902-9b6b-6258039bb6c0-trusted-ca\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.714051 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6521b257-6f92-42e5-be73-72c45ecfc58a-config\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.714551 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/72ebb725-29ae-4902-9b6b-6258039bb6c0-registry-certificates\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.714546 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/72d1d285-af56-462a-80e0-985f8b689b10-audit-dir\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.715797 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4fde7d3-808e-437a-a4b1-a3f44570ba55-config\") pod \"kube-controller-manager-operator-78b949d7b-pq4lv\" (UID: \"d4fde7d3-808e-437a-a4b1-a3f44570ba55\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.717900 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.719840 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6521b257-6f92-42e5-be73-72c45ecfc58a-etcd-service-ca\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.720808 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72d1d285-af56-462a-80e0-985f8b689b10-trusted-ca-bundle\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.722402 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-trusted-ca-bundle\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.723906 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/72d1d285-af56-462a-80e0-985f8b689b10-audit\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.726767 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72ebb725-29ae-4902-9b6b-6258039bb6c0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: 
\"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.728561 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72d1d285-af56-462a-80e0-985f8b689b10-serving-cert\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.732468 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6521b257-6f92-42e5-be73-72c45ecfc58a-serving-cert\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.733295 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6f53bb9-51b7-4a4b-b348-da4598ceccbc-metrics-tls\") pod \"dns-operator-744455d44c-lj6tc\" (UID: \"c6f53bb9-51b7-4a4b-b348-da4598ceccbc\") " pod="openshift-dns-operator/dns-operator-744455d44c-lj6tc" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.733845 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4fde7d3-808e-437a-a4b1-a3f44570ba55-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-pq4lv\" (UID: \"d4fde7d3-808e-437a-a4b1-a3f44570ba55\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.734325 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/72d1d285-af56-462a-80e0-985f8b689b10-encryption-config\") pod \"apiserver-76f77b778f-c5sr8\" 
(UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.734705 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-bound-sa-token\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.734882 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/72d1d285-af56-462a-80e0-985f8b689b10-etcd-client\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.743112 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-oauth-config\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.750032 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4fde7d3-808e-437a-a4b1-a3f44570ba55-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-pq4lv\" (UID: \"d4fde7d3-808e-437a-a4b1-a3f44570ba55\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.751834 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.769371 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-278b8\" (UniqueName: \"kubernetes.io/projected/17bb07e6-67dd-4cc5-b979-9ef794228e81-kube-api-access-278b8\") pod \"console-f9d7485db-47lf7\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784406 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784580 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c6a85015-ccec-4075-8749-fc07bbea7344-encryption-config\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784611 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07887dc7-0606-470d-9c98-fa209d695e60-metrics-tls\") pod \"dns-default-dh4pw\" (UID: \"07887dc7-0606-470d-9c98-fa209d695e60\") " pod="openshift-dns/dns-default-dh4pw" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784632 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5778v\" (UniqueName: \"kubernetes.io/projected/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-kube-api-access-5778v\") pod \"csi-hostpathplugin-dbpvk\" (UID: 
\"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784649 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/87db50f7-cf1f-4f32-b6f4-4f641261c731-signing-key\") pod \"service-ca-9c57cc56f-hplrj\" (UID: \"87db50f7-cf1f-4f32-b6f4-4f641261c731\") " pod="openshift-service-ca/service-ca-9c57cc56f-hplrj" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784671 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6a85015-ccec-4075-8749-fc07bbea7344-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784697 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p77t\" (UniqueName: \"kubernetes.io/projected/882d3e28-3823-42b6-8973-cd07043ad24d-kube-api-access-9p77t\") pod \"service-ca-operator-777779d784-8vn67\" (UID: \"882d3e28-3823-42b6-8973-cd07043ad24d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8vn67" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784718 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7nc5\" (UniqueName: \"kubernetes.io/projected/5d75838b-1762-4f29-89be-817b97eb22f0-kube-api-access-m7nc5\") pod \"migrator-59844c95c7-9j4nq\" (UID: \"5d75838b-1762-4f29-89be-817b97eb22f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9j4nq" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784737 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/c6a85015-ccec-4075-8749-fc07bbea7344-etcd-client\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784762 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e561b490-ea66-4906-9f8b-299a7e5909cf-cert\") pod \"ingress-canary-2rv95\" (UID: \"e561b490-ea66-4906-9f8b-299a7e5909cf\") " pod="openshift-ingress-canary/ingress-canary-2rv95" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784784 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6dcf9089-19f0-421c-83bc-f5ad21bb07a5-tmpfs\") pod \"packageserver-d55dfcdfc-v5mbd\" (UID: \"6dcf9089-19f0-421c-83bc-f5ad21bb07a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784803 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phcjn\" (UniqueName: \"kubernetes.io/projected/450eb855-2d6d-4503-9a7e-1980d8e97346-kube-api-access-phcjn\") pod \"collect-profiles-29528505-5wtqv\" (UID: \"450eb855-2d6d-4503-9a7e-1980d8e97346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784819 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/882d3e28-3823-42b6-8973-cd07043ad24d-serving-cert\") pod \"service-ca-operator-777779d784-8vn67\" (UID: \"882d3e28-3823-42b6-8973-cd07043ad24d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8vn67" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784834 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/87db50f7-cf1f-4f32-b6f4-4f641261c731-signing-cabundle\") pod \"service-ca-9c57cc56f-hplrj\" (UID: \"87db50f7-cf1f-4f32-b6f4-4f641261c731\") " pod="openshift-service-ca/service-ca-9c57cc56f-hplrj" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784851 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf9md\" (UniqueName: \"kubernetes.io/projected/87db50f7-cf1f-4f32-b6f4-4f641261c731-kube-api-access-nf9md\") pod \"service-ca-9c57cc56f-hplrj\" (UID: \"87db50f7-cf1f-4f32-b6f4-4f641261c731\") " pod="openshift-service-ca/service-ca-9c57cc56f-hplrj" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784897 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbchv\" (UniqueName: \"kubernetes.io/projected/395e9ebb-c913-43df-bab9-f4e181167728-kube-api-access-nbchv\") pod \"catalog-operator-68c6474976-ds4d5\" (UID: \"395e9ebb-c913-43df-bab9-f4e181167728\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784914 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60d25c82-47d6-4706-8235-70fd592a984d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8q2r4\" (UID: \"60d25c82-47d6-4706-8235-70fd592a984d\") " pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784930 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/60d25c82-47d6-4706-8235-70fd592a984d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8q2r4\" (UID: \"60d25c82-47d6-4706-8235-70fd592a984d\") " pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" Feb 21 21:48:47 
crc kubenswrapper[4717]: I0221 21:48:47.784955 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6a85015-ccec-4075-8749-fc07bbea7344-serving-cert\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.784972 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6a85015-ccec-4075-8749-fc07bbea7344-audit-dir\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785001 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmcvc\" (UniqueName: \"kubernetes.io/projected/369bb866-95f1-42db-8e35-9e89b6f0d157-kube-api-access-nmcvc\") pod \"cluster-samples-operator-665b6dd947-p6sxz\" (UID: \"369bb866-95f1-42db-8e35-9e89b6f0d157\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6sxz" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785021 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk5w9\" (UniqueName: \"kubernetes.io/projected/ff47caaa-cf37-40ea-8c6c-457189a5432b-kube-api-access-jk5w9\") pod \"control-plane-machine-set-operator-78cbb6b69f-r68hh\" (UID: \"ff47caaa-cf37-40ea-8c6c-457189a5432b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r68hh" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785038 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e172f7a2-4889-47d8-b0b1-a04114d0e328-certs\") pod \"machine-config-server-bsjn6\" (UID: 
\"e172f7a2-4889-47d8-b0b1-a04114d0e328\") " pod="openshift-machine-config-operator/machine-config-server-bsjn6" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785055 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c6a85015-ccec-4075-8749-fc07bbea7344-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785073 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-plugins-dir\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785089 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07887dc7-0606-470d-9c98-fa209d695e60-config-volume\") pod \"dns-default-dh4pw\" (UID: \"07887dc7-0606-470d-9c98-fa209d695e60\") " pod="openshift-dns/dns-default-dh4pw" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785107 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/450eb855-2d6d-4503-9a7e-1980d8e97346-secret-volume\") pod \"collect-profiles-29528505-5wtqv\" (UID: \"450eb855-2d6d-4503-9a7e-1980d8e97346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785124 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28mzl\" (UniqueName: \"kubernetes.io/projected/e561b490-ea66-4906-9f8b-299a7e5909cf-kube-api-access-28mzl\") pod \"ingress-canary-2rv95\" (UID: 
\"e561b490-ea66-4906-9f8b-299a7e5909cf\") " pod="openshift-ingress-canary/ingress-canary-2rv95" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785143 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2e4f85de-f15f-40b8-9bfe-914862b6c20e-srv-cert\") pod \"olm-operator-6b444d44fb-zkqdj\" (UID: \"2e4f85de-f15f-40b8-9bfe-914862b6c20e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785160 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-mountpoint-dir\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785177 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-socket-dir\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785199 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff47caaa-cf37-40ea-8c6c-457189a5432b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r68hh\" (UID: \"ff47caaa-cf37-40ea-8c6c-457189a5432b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r68hh" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785219 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/e172f7a2-4889-47d8-b0b1-a04114d0e328-node-bootstrap-token\") pod \"machine-config-server-bsjn6\" (UID: \"e172f7a2-4889-47d8-b0b1-a04114d0e328\") " pod="openshift-machine-config-operator/machine-config-server-bsjn6" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785237 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6h28\" (UniqueName: \"kubernetes.io/projected/60d25c82-47d6-4706-8235-70fd592a984d-kube-api-access-p6h28\") pod \"marketplace-operator-79b997595-8q2r4\" (UID: \"60d25c82-47d6-4706-8235-70fd592a984d\") " pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785257 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2e4f85de-f15f-40b8-9bfe-914862b6c20e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zkqdj\" (UID: \"2e4f85de-f15f-40b8-9bfe-914862b6c20e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785282 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/882d3e28-3823-42b6-8973-cd07043ad24d-config\") pod \"service-ca-operator-777779d784-8vn67\" (UID: \"882d3e28-3823-42b6-8973-cd07043ad24d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8vn67" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785302 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/395e9ebb-c913-43df-bab9-f4e181167728-profile-collector-cert\") pod \"catalog-operator-68c6474976-ds4d5\" (UID: \"395e9ebb-c913-43df-bab9-f4e181167728\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 
21:48:47.785326 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/450eb855-2d6d-4503-9a7e-1980d8e97346-config-volume\") pod \"collect-profiles-29528505-5wtqv\" (UID: \"450eb855-2d6d-4503-9a7e-1980d8e97346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785342 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-registration-dir\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785358 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzf42\" (UniqueName: \"kubernetes.io/projected/2e4f85de-f15f-40b8-9bfe-914862b6c20e-kube-api-access-jzf42\") pod \"olm-operator-6b444d44fb-zkqdj\" (UID: \"2e4f85de-f15f-40b8-9bfe-914862b6c20e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785383 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6a85015-ccec-4075-8749-fc07bbea7344-audit-policies\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785403 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/369bb866-95f1-42db-8e35-9e89b6f0d157-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p6sxz\" (UID: \"369bb866-95f1-42db-8e35-9e89b6f0d157\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6sxz" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785431 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9315435-1144-473c-9d20-8a0ea85d1199-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6p5kv\" (UID: \"a9315435-1144-473c-9d20-8a0ea85d1199\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785448 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6dcf9089-19f0-421c-83bc-f5ad21bb07a5-webhook-cert\") pod \"packageserver-d55dfcdfc-v5mbd\" (UID: \"6dcf9089-19f0-421c-83bc-f5ad21bb07a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785464 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/395e9ebb-c913-43df-bab9-f4e181167728-srv-cert\") pod \"catalog-operator-68c6474976-ds4d5\" (UID: \"395e9ebb-c913-43df-bab9-f4e181167728\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785491 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6a85015-ccec-4075-8749-fc07bbea7344-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785543 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjwlg\" (UniqueName: 
\"kubernetes.io/projected/c6a85015-ccec-4075-8749-fc07bbea7344-kube-api-access-kjwlg\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785569 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5zmr\" (UniqueName: \"kubernetes.io/projected/07887dc7-0606-470d-9c98-fa209d695e60-kube-api-access-b5zmr\") pod \"dns-default-dh4pw\" (UID: \"07887dc7-0606-470d-9c98-fa209d695e60\") " pod="openshift-dns/dns-default-dh4pw" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785587 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6dcf9089-19f0-421c-83bc-f5ad21bb07a5-apiservice-cert\") pod \"packageserver-d55dfcdfc-v5mbd\" (UID: \"6dcf9089-19f0-421c-83bc-f5ad21bb07a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" Feb 21 21:48:47 crc kubenswrapper[4717]: E0221 21:48:47.785605 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:48.285584066 +0000 UTC m=+143.067117688 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785652 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj29z\" (UniqueName: \"kubernetes.io/projected/a9315435-1144-473c-9d20-8a0ea85d1199-kube-api-access-bj29z\") pod \"package-server-manager-789f6589d5-6p5kv\" (UID: \"a9315435-1144-473c-9d20-8a0ea85d1199\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785679 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-csi-data-dir\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785701 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slhj8\" (UniqueName: \"kubernetes.io/projected/6dcf9089-19f0-421c-83bc-f5ad21bb07a5-kube-api-access-slhj8\") pod \"packageserver-d55dfcdfc-v5mbd\" (UID: \"6dcf9089-19f0-421c-83bc-f5ad21bb07a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.785722 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llh9n\" (UniqueName: \"kubernetes.io/projected/e172f7a2-4889-47d8-b0b1-a04114d0e328-kube-api-access-llh9n\") pod 
\"machine-config-server-bsjn6\" (UID: \"e172f7a2-4889-47d8-b0b1-a04114d0e328\") " pod="openshift-machine-config-operator/machine-config-server-bsjn6" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.787034 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6dcf9089-19f0-421c-83bc-f5ad21bb07a5-tmpfs\") pod \"packageserver-d55dfcdfc-v5mbd\" (UID: \"6dcf9089-19f0-421c-83bc-f5ad21bb07a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.788349 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-csi-data-dir\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.788539 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-mountpoint-dir\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.791227 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-registration-dir\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.791667 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-socket-dir\") pod \"csi-hostpathplugin-dbpvk\" (UID: 
\"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.792600 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6a85015-ccec-4075-8749-fc07bbea7344-audit-policies\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.794998 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60d25c82-47d6-4706-8235-70fd592a984d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8q2r4\" (UID: \"60d25c82-47d6-4706-8235-70fd592a984d\") " pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.800792 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6dcf9089-19f0-421c-83bc-f5ad21bb07a5-apiservice-cert\") pod \"packageserver-d55dfcdfc-v5mbd\" (UID: \"6dcf9089-19f0-421c-83bc-f5ad21bb07a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.800933 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6a85015-ccec-4075-8749-fc07bbea7344-audit-dir\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.801402 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07887dc7-0606-470d-9c98-fa209d695e60-config-volume\") pod \"dns-default-dh4pw\" (UID: 
\"07887dc7-0606-470d-9c98-fa209d695e60\") " pod="openshift-dns/dns-default-dh4pw" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.803056 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c6a85015-ccec-4075-8749-fc07bbea7344-encryption-config\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.803332 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/87db50f7-cf1f-4f32-b6f4-4f641261c731-signing-cabundle\") pod \"service-ca-9c57cc56f-hplrj\" (UID: \"87db50f7-cf1f-4f32-b6f4-4f641261c731\") " pod="openshift-service-ca/service-ca-9c57cc56f-hplrj" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.803630 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-plugins-dir\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.803718 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/882d3e28-3823-42b6-8973-cd07043ad24d-config\") pod \"service-ca-operator-777779d784-8vn67\" (UID: \"882d3e28-3823-42b6-8973-cd07043ad24d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8vn67" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.803720 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e561b490-ea66-4906-9f8b-299a7e5909cf-cert\") pod \"ingress-canary-2rv95\" (UID: \"e561b490-ea66-4906-9f8b-299a7e5909cf\") " 
pod="openshift-ingress-canary/ingress-canary-2rv95" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.804083 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/450eb855-2d6d-4503-9a7e-1980d8e97346-config-volume\") pod \"collect-profiles-29528505-5wtqv\" (UID: \"450eb855-2d6d-4503-9a7e-1980d8e97346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.804650 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c6a85015-ccec-4075-8749-fc07bbea7344-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.804784 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ltrd\" (UniqueName: \"kubernetes.io/projected/c6f53bb9-51b7-4a4b-b348-da4598ceccbc-kube-api-access-4ltrd\") pod \"dns-operator-744455d44c-lj6tc\" (UID: \"c6f53bb9-51b7-4a4b-b348-da4598ceccbc\") " pod="openshift-dns-operator/dns-operator-744455d44c-lj6tc" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.806826 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/395e9ebb-c913-43df-bab9-f4e181167728-srv-cert\") pod \"catalog-operator-68c6474976-ds4d5\" (UID: \"395e9ebb-c913-43df-bab9-f4e181167728\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.806908 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/882d3e28-3823-42b6-8973-cd07043ad24d-serving-cert\") pod \"service-ca-operator-777779d784-8vn67\" (UID: 
\"882d3e28-3823-42b6-8973-cd07043ad24d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8vn67" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.807751 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9315435-1144-473c-9d20-8a0ea85d1199-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6p5kv\" (UID: \"a9315435-1144-473c-9d20-8a0ea85d1199\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.807926 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c6a85015-ccec-4075-8749-fc07bbea7344-etcd-client\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.810772 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2e4f85de-f15f-40b8-9bfe-914862b6c20e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zkqdj\" (UID: \"2e4f85de-f15f-40b8-9bfe-914862b6c20e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.811996 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e172f7a2-4889-47d8-b0b1-a04114d0e328-node-bootstrap-token\") pod \"machine-config-server-bsjn6\" (UID: \"e172f7a2-4889-47d8-b0b1-a04114d0e328\") " pod="openshift-machine-config-operator/machine-config-server-bsjn6" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.814314 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/e172f7a2-4889-47d8-b0b1-a04114d0e328-certs\") pod \"machine-config-server-bsjn6\" (UID: \"e172f7a2-4889-47d8-b0b1-a04114d0e328\") " pod="openshift-machine-config-operator/machine-config-server-bsjn6" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.814421 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6dcf9089-19f0-421c-83bc-f5ad21bb07a5-webhook-cert\") pod \"packageserver-d55dfcdfc-v5mbd\" (UID: \"6dcf9089-19f0-421c-83bc-f5ad21bb07a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.814455 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6a85015-ccec-4075-8749-fc07bbea7344-serving-cert\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.814896 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/450eb855-2d6d-4503-9a7e-1980d8e97346-secret-volume\") pod \"collect-profiles-29528505-5wtqv\" (UID: \"450eb855-2d6d-4503-9a7e-1980d8e97346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.817169 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/395e9ebb-c913-43df-bab9-f4e181167728-profile-collector-cert\") pod \"catalog-operator-68c6474976-ds4d5\" (UID: \"395e9ebb-c913-43df-bab9-f4e181167728\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.817182 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/369bb866-95f1-42db-8e35-9e89b6f0d157-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p6sxz\" (UID: \"369bb866-95f1-42db-8e35-9e89b6f0d157\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6sxz" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.818150 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/60d25c82-47d6-4706-8235-70fd592a984d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8q2r4\" (UID: \"60d25c82-47d6-4706-8235-70fd592a984d\") " pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.819118 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07887dc7-0606-470d-9c98-fa209d695e60-metrics-tls\") pod \"dns-default-dh4pw\" (UID: \"07887dc7-0606-470d-9c98-fa209d695e60\") " pod="openshift-dns/dns-default-dh4pw" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.820693 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2e4f85de-f15f-40b8-9bfe-914862b6c20e-srv-cert\") pod \"olm-operator-6b444d44fb-zkqdj\" (UID: \"2e4f85de-f15f-40b8-9bfe-914862b6c20e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.822488 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff47caaa-cf37-40ea-8c6c-457189a5432b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r68hh\" (UID: \"ff47caaa-cf37-40ea-8c6c-457189a5432b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r68hh" Feb 21 21:48:47 crc 
kubenswrapper[4717]: I0221 21:48:47.822739 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jllh4\" (UniqueName: \"kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-kube-api-access-jllh4\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.823554 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/87db50f7-cf1f-4f32-b6f4-4f641261c731-signing-key\") pod \"service-ca-9c57cc56f-hplrj\" (UID: \"87db50f7-cf1f-4f32-b6f4-4f641261c731\") " pod="openshift-service-ca/service-ca-9c57cc56f-hplrj" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.836526 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.859919 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glv2m\" (UniqueName: \"kubernetes.io/projected/6521b257-6f92-42e5-be73-72c45ecfc58a-kube-api-access-glv2m\") pod \"etcd-operator-b45778765-l7kgh\" (UID: \"6521b257-6f92-42e5-be73-72c45ecfc58a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.875931 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn"] Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.881764 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znpk6\" (UniqueName: \"kubernetes.io/projected/72d1d285-af56-462a-80e0-985f8b689b10-kube-api-access-znpk6\") pod \"apiserver-76f77b778f-c5sr8\" (UID: \"72d1d285-af56-462a-80e0-985f8b689b10\") " pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:47 
crc kubenswrapper[4717]: I0221 21:48:47.889784 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:47 crc kubenswrapper[4717]: E0221 21:48:47.890958 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:48.390937215 +0000 UTC m=+143.172470937 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.918143 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phcjn\" (UniqueName: \"kubernetes.io/projected/450eb855-2d6d-4503-9a7e-1980d8e97346-kube-api-access-phcjn\") pod \"collect-profiles-29528505-5wtqv\" (UID: \"450eb855-2d6d-4503-9a7e-1980d8e97346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.924197 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p77t\" (UniqueName: \"kubernetes.io/projected/882d3e28-3823-42b6-8973-cd07043ad24d-kube-api-access-9p77t\") pod \"service-ca-operator-777779d784-8vn67\" (UID: 
\"882d3e28-3823-42b6-8973-cd07043ad24d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8vn67" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.930708 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7nc5\" (UniqueName: \"kubernetes.io/projected/5d75838b-1762-4f29-89be-817b97eb22f0-kube-api-access-m7nc5\") pod \"migrator-59844c95c7-9j4nq\" (UID: \"5d75838b-1762-4f29-89be-817b97eb22f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9j4nq" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.948171 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llh9n\" (UniqueName: \"kubernetes.io/projected/e172f7a2-4889-47d8-b0b1-a04114d0e328-kube-api-access-llh9n\") pod \"machine-config-server-bsjn6\" (UID: \"e172f7a2-4889-47d8-b0b1-a04114d0e328\") " pod="openshift-machine-config-operator/machine-config-server-bsjn6" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.965223 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj29z\" (UniqueName: \"kubernetes.io/projected/a9315435-1144-473c-9d20-8a0ea85d1199-kube-api-access-bj29z\") pod \"package-server-manager-789f6589d5-6p5kv\" (UID: \"a9315435-1144-473c-9d20-8a0ea85d1199\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv" Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.991761 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:47 crc kubenswrapper[4717]: E0221 21:48:47.992496 4717 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:48.492472797 +0000 UTC m=+143.274006419 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:47 crc kubenswrapper[4717]: I0221 21:48:47.992512 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5778v\" (UniqueName: \"kubernetes.io/projected/a26b8bcd-9a3f-4fd0-8d35-2272127096e8-kube-api-access-5778v\") pod \"csi-hostpathplugin-dbpvk\" (UID: \"a26b8bcd-9a3f-4fd0-8d35-2272127096e8\") " pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.007181 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slhj8\" (UniqueName: \"kubernetes.io/projected/6dcf9089-19f0-421c-83bc-f5ad21bb07a5-kube-api-access-slhj8\") pod \"packageserver-d55dfcdfc-v5mbd\" (UID: \"6dcf9089-19f0-421c-83bc-f5ad21bb07a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.018996 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg"] Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.030024 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk5w9\" (UniqueName: 
\"kubernetes.io/projected/ff47caaa-cf37-40ea-8c6c-457189a5432b-kube-api-access-jk5w9\") pod \"control-plane-machine-set-operator-78cbb6b69f-r68hh\" (UID: \"ff47caaa-cf37-40ea-8c6c-457189a5432b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r68hh" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.031084 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.034387 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv"] Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.036938 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.062617 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf9md\" (UniqueName: \"kubernetes.io/projected/87db50f7-cf1f-4f32-b6f4-4f641261c731-kube-api-access-nf9md\") pod \"service-ca-9c57cc56f-hplrj\" (UID: \"87db50f7-cf1f-4f32-b6f4-4f641261c731\") " pod="openshift-service-ca/service-ca-9c57cc56f-hplrj" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.069469 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-lj6tc" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.074917 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbchv\" (UniqueName: \"kubernetes.io/projected/395e9ebb-c913-43df-bab9-f4e181167728-kube-api-access-nbchv\") pod \"catalog-operator-68c6474976-ds4d5\" (UID: \"395e9ebb-c913-43df-bab9-f4e181167728\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.089205 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.094701 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:48 crc kubenswrapper[4717]: E0221 21:48:48.095359 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:48.595342399 +0000 UTC m=+143.376876021 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.104965 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzf42\" (UniqueName: \"kubernetes.io/projected/2e4f85de-f15f-40b8-9bfe-914862b6c20e-kube-api-access-jzf42\") pod \"olm-operator-6b444d44fb-zkqdj\" (UID: \"2e4f85de-f15f-40b8-9bfe-914862b6c20e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.119935 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjwlg\" (UniqueName: \"kubernetes.io/projected/c6a85015-ccec-4075-8749-fc07bbea7344-kube-api-access-kjwlg\") pod \"apiserver-7bbb656c7d-cvrbm\" (UID: \"c6a85015-ccec-4075-8749-fc07bbea7344\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.124374 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.140689 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5zmr\" (UniqueName: \"kubernetes.io/projected/07887dc7-0606-470d-9c98-fa209d695e60-kube-api-access-b5zmr\") pod \"dns-default-dh4pw\" (UID: \"07887dc7-0606-470d-9c98-fa209d695e60\") " pod="openshift-dns/dns-default-dh4pw" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.143398 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.148827 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmcvc\" (UniqueName: \"kubernetes.io/projected/369bb866-95f1-42db-8e35-9e89b6f0d157-kube-api-access-nmcvc\") pod \"cluster-samples-operator-665b6dd947-p6sxz\" (UID: \"369bb866-95f1-42db-8e35-9e89b6f0d157\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6sxz" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.149643 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-84tzn"] Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.153237 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.155326 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p"] Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.158847 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b4752"] Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.160291 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz"] Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.163751 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hplrj"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.163916 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28mzl\" (UniqueName: \"kubernetes.io/projected/e561b490-ea66-4906-9f8b-299a7e5909cf-kube-api-access-28mzl\") pod \"ingress-canary-2rv95\" (UID: \"e561b490-ea66-4906-9f8b-299a7e5909cf\") " pod="openshift-ingress-canary/ingress-canary-2rv95"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.164333 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r" event={"ID":"4f01ee61-ced4-422e-8397-6b7f905d8d57","Type":"ContainerStarted","Data":"6f6463cc6aa0b97af2cce16305187e15e795ac72e694e0e399a7c2a7b293ec5f"}
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.164365 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r" event={"ID":"4f01ee61-ced4-422e-8397-6b7f905d8d57","Type":"ContainerStarted","Data":"9fafb76a8af1c297399a66207e7a4cb54cb02b4c05856e88692c0c2d9c4bd60c"}
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.166006 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn" event={"ID":"57fc07c3-9f4a-4494-8a45-04efba3c358c","Type":"ContainerStarted","Data":"b9d700ba187d6e708d97b0a58454b2ea516667b455b7bce13bcbf1b30b162295"}
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.167618 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bxq5v" event={"ID":"0ecef83c-1f5a-4db1-93a9-a18476653e8a","Type":"ContainerStarted","Data":"d33b33904b5b5b306cba7027d21b69a5502ce322ef8bbca272020a5cfe38bbfd"}
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.167659 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bxq5v" event={"ID":"0ecef83c-1f5a-4db1-93a9-a18476653e8a","Type":"ContainerStarted","Data":"244584b6294e4153e90575eb3f7f4f47829c47cec613a9add1dcf482ac1607d0"}
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.168482 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-bxq5v"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.169464 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-g8wzz" event={"ID":"dbd91a09-921f-4585-986a-90fd4a111781","Type":"ContainerStarted","Data":"7df547a4c6e760d11af3f1d5e93d87de0476e7e2d1dad07f89a7b6aadbe6aa37"}
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.169490 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-g8wzz" event={"ID":"dbd91a09-921f-4585-986a-90fd4a111781","Type":"ContainerStarted","Data":"bcb2277a61660e83d17bc2d40b4eb0288e1169fc007457908f56ed02a02f1e61"}
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.170786 4717 patch_prober.go:28] interesting pod/console-operator-58897d9998-bxq5v container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.170822 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bxq5v" podUID="0ecef83c-1f5a-4db1-93a9-a18476653e8a" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.174361 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9j4nq"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.179187 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" event={"ID":"86542c4e-7f2b-4933-9d0d-737228524851","Type":"ContainerStarted","Data":"39d1cbfc87adf21f27f9798085ed1c90567909a1a41bc56cc73e84d2901c2272"}
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.179217 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" event={"ID":"86542c4e-7f2b-4933-9d0d-737228524851","Type":"ContainerStarted","Data":"743bb0d19d88c2972cba9ba7b9cb1ce2f7b050aab0e7f0cedfad6820a225c28b"}
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.181220 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.182457 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg" event={"ID":"2f888dde-fb26-45b0-a084-f63a3c99bf50","Type":"ContainerStarted","Data":"61b326f2496a2b7dcc182c9e3460f84ef9fdd5a0659fb752a36d9c6470239a5a"}
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.181431 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6h28\" (UniqueName: \"kubernetes.io/projected/60d25c82-47d6-4706-8235-70fd592a984d-kube-api-access-p6h28\") pod \"marketplace-operator-79b997595-8q2r4\" (UID: \"60d25c82-47d6-4706-8235-70fd592a984d\") " pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.182607 4717 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ltlhm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.182648 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" podUID="86542c4e-7f2b-4933-9d0d-737228524851" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.184764 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv" event={"ID":"eb856235-9997-4747-9a24-fca2600a68d9","Type":"ContainerStarted","Data":"388ae4b3855b81bb93d1670ebb8a8d431064cf6974567c0e2d0caf48dbe256f6"}
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.185100 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.195456 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.196159 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5"
Feb 21 21:48:48 crc kubenswrapper[4717]: E0221 21:48:48.196355 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:48.696332688 +0000 UTC m=+143.477866310 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.204653 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r68hh"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.211698 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.212565 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" event={"ID":"ffa8e999-61a7-4b74-8d0d-74652418374b","Type":"ContainerStarted","Data":"c62145a383ea89f3e5a777bafaee86cd361bbccd542b265ac11361d702c0c894"}
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.212615 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" event={"ID":"ffa8e999-61a7-4b74-8d0d-74652418374b","Type":"ContainerStarted","Data":"bcd76adc45c4c508c3823d194cb26a948f9a70e2fea514df4aced8fdccf3489f"}
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.219963 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8vn67"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.230456 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.242588 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bsjn6"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.245084 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dh4pw"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.255613 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2rv95"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.282250 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dbpvk"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.298791 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:48 crc kubenswrapper[4717]: E0221 21:48:48.299627 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:48.79960724 +0000 UTC m=+143.581140862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.358639 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qm2kf"]
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.361819 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf"]
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.376747 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8cf52"]
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.392307 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-phxph"]
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.404463 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 21:48:48 crc kubenswrapper[4717]: E0221 21:48:48.404724 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:48.904668862 +0000 UTC m=+143.686202484 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.405074 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:48 crc kubenswrapper[4717]: E0221 21:48:48.409132 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:48.909108903 +0000 UTC m=+143.690642525 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.411842 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6sxz"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.482409 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-st74w"]
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.489578 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-76zmn"]
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.509920 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 21:48:48 crc kubenswrapper[4717]: E0221 21:48:48.510319 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:49.010300248 +0000 UTC m=+143.791833860 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.556752 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-47lf7"]
Feb 21 21:48:48 crc kubenswrapper[4717]: W0221 21:48:48.578296 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f6219e3_5e33_4c13_9c52_de506ba0b6a6.slice/crio-43dab0765dd4b22f43b1d7a516c59998be5f250171874caa1ee91a689a5c7657 WatchSource:0}: Error finding container 43dab0765dd4b22f43b1d7a516c59998be5f250171874caa1ee91a689a5c7657: Status 404 returned error can't find the container with id 43dab0765dd4b22f43b1d7a516c59998be5f250171874caa1ee91a689a5c7657
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.582217 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh"]
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.588240 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lj6tc"]
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.609918 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-g8wzz"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.609965 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9"]
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.612112 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:48 crc kubenswrapper[4717]: E0221 21:48:48.612473 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:49.112460814 +0000 UTC m=+143.893994436 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.619155 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 21 21:48:48 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld
Feb 21 21:48:48 crc kubenswrapper[4717]: [+]process-running ok
Feb 21 21:48:48 crc kubenswrapper[4717]: healthz check failed
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.619190 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.636058 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-c5sr8"]
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.669777 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv"]
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.677515 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hplrj"]
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.695744 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l7kgh"]
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.700162 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv"]
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.715217 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 21:48:48 crc kubenswrapper[4717]: E0221 21:48:48.715711 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:49.215692043 +0000 UTC m=+143.997225665 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:48 crc kubenswrapper[4717]: W0221 21:48:48.730456 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe4fc787_106a_4c35_ba6c_74a64a69db9f.slice/crio-bf289f0a8e555f614ba08e468a336d82d1a48de0dc7ad61ffd90634790e215bb WatchSource:0}: Error finding container bf289f0a8e555f614ba08e468a336d82d1a48de0dc7ad61ffd90634790e215bb: Status 404 returned error can't find the container with id bf289f0a8e555f614ba08e468a336d82d1a48de0dc7ad61ffd90634790e215bb
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.816614 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:48 crc kubenswrapper[4717]: E0221 21:48:48.817408 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:49.317393209 +0000 UTC m=+144.098926831 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:48 crc kubenswrapper[4717]: I0221 21:48:48.918500 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 21:48:48 crc kubenswrapper[4717]: E0221 21:48:48.918741 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:49.418695236 +0000 UTC m=+144.200228848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.023352 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:49 crc kubenswrapper[4717]: E0221 21:48:49.024219 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:49.524203848 +0000 UTC m=+144.305737470 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.050489 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" podStartSLOduration=122.050465087 podStartE2EDuration="2m2.050465087s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:49.048694956 +0000 UTC m=+143.830228578" watchObservedRunningTime="2026-02-21 21:48:49.050465087 +0000 UTC m=+143.831998699"
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.100695 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-bxq5v" podStartSLOduration=122.100678609 podStartE2EDuration="2m2.100678609s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:49.099438061 +0000 UTC m=+143.880971683" watchObservedRunningTime="2026-02-21 21:48:49.100678609 +0000 UTC m=+143.882212231"
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.129532 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 21:48:49 crc kubenswrapper[4717]: E0221 21:48:49.130044 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:49.630025107 +0000 UTC m=+144.411558729 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.231752 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:49 crc kubenswrapper[4717]: E0221 21:48:49.232693 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:49.732679055 +0000 UTC m=+144.514212677 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.238099 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8q2r4"]
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.254480 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p" event={"ID":"d2206ca5-861f-4a89-8218-8c8a1264b2d8","Type":"ContainerStarted","Data":"9943e9bdd6a4b8d8a6dce9c4ba09529665be908da378ed23b6ed079fcb6ba63d"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.254582 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p" event={"ID":"d2206ca5-861f-4a89-8218-8c8a1264b2d8","Type":"ContainerStarted","Data":"fd8646a7cb7243b34619b1a5fd9f21b54de5a9ab14cb927dadd0b7ab22b2b97a"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.258243 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lj6tc" event={"ID":"c6f53bb9-51b7-4a4b-b348-da4598ceccbc","Type":"ContainerStarted","Data":"071f7c0d1665c606586613c57e75c9f597114da18c3e90149154d8349fe2a0f8"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.262750 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8cf52" event={"ID":"3f6219e3-5e33-4c13-9c52-de506ba0b6a6","Type":"ContainerStarted","Data":"43dab0765dd4b22f43b1d7a516c59998be5f250171874caa1ee91a689a5c7657"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.263844 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh" event={"ID":"fe4fc787-106a-4c35-ba6c-74a64a69db9f","Type":"ContainerStarted","Data":"bf289f0a8e555f614ba08e468a336d82d1a48de0dc7ad61ffd90634790e215bb"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.266674 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv" event={"ID":"a9315435-1144-473c-9d20-8a0ea85d1199","Type":"ContainerStarted","Data":"ad91c2d5d0a6c3eeb24463206c85b37a0e700eab56ab29334d7f4c5d76dfea5f"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.297556 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn" event={"ID":"57fc07c3-9f4a-4494-8a45-04efba3c358c","Type":"ContainerStarted","Data":"3bba96a6066a0f162f914056e9bf0e035cff11c874cbbc1ed4e6fc369d13c248"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.321682 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752" event={"ID":"bdefa569-260f-4b10-a611-ab781c4fea72","Type":"ContainerStarted","Data":"40fce9c657458db358767a526e0fc6c6c7bd78292b9c9cb7b82a3e8019d272f2"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.323271 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-g8wzz" podStartSLOduration=122.323261297 podStartE2EDuration="2m2.323261297s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:49.321989568 +0000 UTC m=+144.103523190" watchObservedRunningTime="2026-02-21 21:48:49.323261297 +0000 UTC m=+144.104794919"
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.326010 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-phxph" event={"ID":"83f9314e-459c-4866-830b-80e171b696dd","Type":"ContainerStarted","Data":"a70e9a475aafc0122d6a04c7a6b0ababb5e357a98901f462856ca8dfb1649cf6"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.331088 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv" event={"ID":"eb856235-9997-4747-9a24-fca2600a68d9","Type":"ContainerStarted","Data":"a294ed082b2ee19a12354c91d438021dd5bcdbd5cda07bcd6ddab14776d12492"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.332564 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 21:48:49 crc kubenswrapper[4717]: E0221 21:48:49.333418 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:49.833397139 +0000 UTC m=+144.614930771 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.335741 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" event={"ID":"d105add5-0618-41cf-ae5d-59b739694e4c","Type":"ContainerStarted","Data":"8b4cbe7d323243073b18369ba4c18078cc7cdc895798c88776a5032994e4432b"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.337171 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg" event={"ID":"2f888dde-fb26-45b0-a084-f63a3c99bf50","Type":"ContainerStarted","Data":"d7f3267b4a520ff3d2a33a12353a6461d333b0a349dada42934943ce25ff8379"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.341236 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-47lf7" event={"ID":"17bb07e6-67dd-4cc5-b979-9ef794228e81","Type":"ContainerStarted","Data":"6f8bf24181fcd83bc0444819a04171cb4c2426413dde130d90a805ca205d20a6"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.342365 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hplrj" event={"ID":"87db50f7-cf1f-4f32-b6f4-4f641261c731","Type":"ContainerStarted","Data":"2963ae80a6a0ac70f4c6834de325ebd7825b757e2c8ec0cf82079498ba1fc5d5"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.344108 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bsjn6" event={"ID":"e172f7a2-4889-47d8-b0b1-a04114d0e328","Type":"ContainerStarted","Data":"f799a23b8fa90d7b4a771bfe806fbb4fc87e173c8b6e2a49dfe7f397346fe0e3"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.351240 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" event={"ID":"5c73e02c-cb77-47e2-bf8a-1092ada428d5","Type":"ContainerStarted","Data":"2c1aacf0d9100b99eedbff5861f8ed79eb3254b5f9201d39173d1f0fa76715b6"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.352253 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" event={"ID":"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30","Type":"ContainerStarted","Data":"2d2b8a16ddbdaa9bb6049f15ba18f363dcc776d4bd21aa97b33dc27c13862100"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.353054 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz" event={"ID":"37a6cccd-b5a0-42bd-b580-3fe07356b864","Type":"ContainerStarted","Data":"5f8995570a6b3391b00f12e9ec45cf1409111894584972f90fbd24d89931beb8"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.355649 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w" event={"ID":"520428c8-dcc3-4649-8b61-346774136b38","Type":"ContainerStarted","Data":"8d5e2c29ca4c3f655bdd9c16cffa59b358b5676a4f2d9a7007b193240ea6abf0"}
Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.368945 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv" event={"ID":"d4fde7d3-808e-437a-a4b1-a3f44570ba55","Type":"ContainerStarted","Data":"580158d0ecfb3d3e1e84a3e6d41c396501007d3d60d8a547ac485133aec0e45c"}
Feb 21 21:48:49 crc
kubenswrapper[4717]: I0221 21:48:49.376959 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" event={"ID":"ffa8e999-61a7-4b74-8d0d-74652418374b","Type":"ContainerStarted","Data":"e798e58e8a585bb2608c37613fa3eed1658dd9b8d1a0212435e52fe4b6d3256f"} Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.388504 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn" event={"ID":"56758c2e-648d-41fe-8758-439f0070d150","Type":"ContainerStarted","Data":"41db2e65a96021fab01dba4c38ff48954e8a2a8600e15290486a04b27ac69e60"} Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.397989 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf" event={"ID":"290a3ba3-ff12-4724-9cdb-c41dab7b0827","Type":"ContainerStarted","Data":"186b2ae35389b2617eb935716c815d8b147062837a7dc5dabe0ca0e4098891e9"} Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.402562 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" event={"ID":"72d1d285-af56-462a-80e0-985f8b689b10","Type":"ContainerStarted","Data":"36be2939d161a3e73293489f9685004e130e230907a7bd371f62939924d99566"} Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.404639 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh" event={"ID":"6521b257-6f92-42e5-be73-72c45ecfc58a","Type":"ContainerStarted","Data":"fb4011e8273edbb2370a7fbe500eed0e4cfa581b4daf72e170f924c36962fecc"} Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.426418 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.436902 4717 csr.go:261] certificate signing request csr-gdmsp is 
approved, waiting to be issued Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.444419 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:49 crc kubenswrapper[4717]: E0221 21:48:49.446753 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:49.946734369 +0000 UTC m=+144.728267991 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.448569 4717 csr.go:257] certificate signing request csr-gdmsp is issued Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.486130 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-bxq5v" Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.550703 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:49 crc kubenswrapper[4717]: E0221 21:48:49.551287 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:50.051255678 +0000 UTC m=+144.832789300 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.551954 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:49 crc kubenswrapper[4717]: E0221 21:48:49.554475 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:50.054457951 +0000 UTC m=+144.835991573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.632444 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 21:48:49 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 21 21:48:49 crc kubenswrapper[4717]: [+]process-running ok Feb 21 21:48:49 crc kubenswrapper[4717]: healthz check failed Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.633049 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.656769 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:49 crc kubenswrapper[4717]: E0221 21:48:49.657920 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-21 21:48:50.157900086 +0000 UTC m=+144.939433708 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.686541 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zzd8r" podStartSLOduration=122.686517468 podStartE2EDuration="2m2.686517468s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:49.632518139 +0000 UTC m=+144.414051761" watchObservedRunningTime="2026-02-21 21:48:49.686517468 +0000 UTC m=+144.468051090" Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.767321 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:49 crc kubenswrapper[4717]: E0221 21:48:49.767900 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:50.267885411 +0000 UTC m=+145.049419033 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.876544 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv"] Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.879545 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.886828 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2rv95"] Feb 21 21:48:49 crc kubenswrapper[4717]: E0221 21:48:49.888057 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:50.388018226 +0000 UTC m=+145.169551848 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.944955 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj"] Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.964559 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dh4pw"] Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.990973 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:49 crc kubenswrapper[4717]: E0221 21:48:49.991843 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:50.49183007 +0000 UTC m=+145.273363692 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:49 crc kubenswrapper[4717]: I0221 21:48:49.998312 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6sxz"] Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.057882 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5"] Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.057976 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9j4nq"] Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.091658 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:50 crc kubenswrapper[4717]: E0221 21:48:50.092097 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:50.592057391 +0000 UTC m=+145.373591013 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.094504 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:50 crc kubenswrapper[4717]: E0221 21:48:50.098118 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:50.59810426 +0000 UTC m=+145.379637882 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.143206 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dk9hn" podStartSLOduration=122.143189796 podStartE2EDuration="2m2.143189796s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:50.098611361 +0000 UTC m=+144.880144983" watchObservedRunningTime="2026-02-21 21:48:50.143189796 +0000 UTC m=+144.924723418" Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.145541 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nrfv" podStartSLOduration=123.14553624 podStartE2EDuration="2m3.14553624s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:50.144098847 +0000 UTC m=+144.925632469" watchObservedRunningTime="2026-02-21 21:48:50.14553624 +0000 UTC m=+144.927069852" Feb 21 21:48:50 crc kubenswrapper[4717]: W0221 21:48:50.180926 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d75838b_1762_4f29_89be_817b97eb22f0.slice/crio-c87ca0099eefc7ce18a7d56f73055b7dee119e60ab587e9375a34d85702db531 WatchSource:0}: Error finding container c87ca0099eefc7ce18a7d56f73055b7dee119e60ab587e9375a34d85702db531: Status 404 returned error can't find the container with id c87ca0099eefc7ce18a7d56f73055b7dee119e60ab587e9375a34d85702db531 Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.198772 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:50 crc kubenswrapper[4717]: E0221 21:48:50.199240 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:50.699221922 +0000 UTC m=+145.480755544 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.218007 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z7hgg" podStartSLOduration=122.217985509 podStartE2EDuration="2m2.217985509s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:50.175543603 +0000 UTC m=+144.957077225" watchObservedRunningTime="2026-02-21 21:48:50.217985509 +0000 UTC m=+144.999519121" Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.257725 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r68hh"] Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.300211 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:50 crc kubenswrapper[4717]: E0221 21:48:50.300580 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-21 21:48:50.800566219 +0000 UTC m=+145.582099841 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.342376 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-27zzk" podStartSLOduration=123.34234827 podStartE2EDuration="2m3.34234827s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:50.328513485 +0000 UTC m=+145.110047117" watchObservedRunningTime="2026-02-21 21:48:50.34234827 +0000 UTC m=+145.123881892" Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.345329 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm"] Feb 21 21:48:50 crc kubenswrapper[4717]: W0221 21:48:50.375874 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff47caaa_cf37_40ea_8c6c_457189a5432b.slice/crio-89f0ce1bc30255f7617ae763eaa16184b24cc4cd571793b071508ad94a42b342 WatchSource:0}: Error finding container 89f0ce1bc30255f7617ae763eaa16184b24cc4cd571793b071508ad94a42b342: Status 404 returned error can't find the container with id 89f0ce1bc30255f7617ae763eaa16184b24cc4cd571793b071508ad94a42b342 Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.402533 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:50 crc kubenswrapper[4717]: E0221 21:48:50.402885 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:50.902848048 +0000 UTC m=+145.684381670 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.411563 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8vn67"] Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.444595 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dbpvk"] Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.450302 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-21 21:43:49 +0000 UTC, rotation deadline is 2026-12-01 15:44:11.456621649 +0000 UTC Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.450365 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6785h55m21.00625827s for next certificate rotation Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 
21:48:50.472987 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd"] Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.503813 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:50 crc kubenswrapper[4717]: E0221 21:48:50.504533 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:51.004507963 +0000 UTC m=+145.786041675 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.512600 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-phxph" event={"ID":"83f9314e-459c-4866-830b-80e171b696dd","Type":"ContainerStarted","Data":"4c02810175e62d54ef7e44ae13d48b3904d32c1bfda36a546c8bdfdeb7b8008a"} Feb 21 21:48:50 crc kubenswrapper[4717]: W0221 21:48:50.531182 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod882d3e28_3823_42b6_8973_cd07043ad24d.slice/crio-5f30184acfc3710658b58af4fd86139408862d69aff3e6f2f037aec9fcc363b8 WatchSource:0}: Error finding container 5f30184acfc3710658b58af4fd86139408862d69aff3e6f2f037aec9fcc363b8: Status 404 returned error can't find the container with id 5f30184acfc3710658b58af4fd86139408862d69aff3e6f2f037aec9fcc363b8 Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.548448 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn" event={"ID":"56758c2e-648d-41fe-8758-439f0070d150","Type":"ContainerStarted","Data":"fa82a6ef3a5c36da5ccbd2231381fa1d69897968bb45475936739c966e110a8b"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.596624 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz" event={"ID":"37a6cccd-b5a0-42bd-b580-3fe07356b864","Type":"ContainerStarted","Data":"016d5047695d3edabe979dbb9fb160f9e3b5f4f2765711edf0d53d598af73f00"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.605950 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:50 crc kubenswrapper[4717]: E0221 21:48:50.606518 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:51.106491085 +0000 UTC m=+145.888024707 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.615476 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8cf52" event={"ID":"3f6219e3-5e33-4c13-9c52-de506ba0b6a6","Type":"ContainerStarted","Data":"d11c861a9f869ebf513c01201d00a51e9f495760e6e43191fac94bbbbbe86764"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.616401 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8cf52" Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.616539 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 21:48:50 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 21 21:48:50 crc kubenswrapper[4717]: [+]process-running ok Feb 21 21:48:50 crc kubenswrapper[4717]: healthz check failed Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.616571 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.622412 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-8cf52 container/download-server namespace/openshift-console: Readiness probe status=failure 
output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.622477 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8cf52" podUID="3f6219e3-5e33-4c13-9c52-de506ba0b6a6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.636223 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dh4pw" event={"ID":"07887dc7-0606-470d-9c98-fa209d695e60","Type":"ContainerStarted","Data":"63f968c10d84a7afa718fec3e0dff21942310699b96552261b91f8a9bd80b208"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.637722 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c4kcz" podStartSLOduration=123.637701705 podStartE2EDuration="2m3.637701705s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:50.637445399 +0000 UTC m=+145.418979021" watchObservedRunningTime="2026-02-21 21:48:50.637701705 +0000 UTC m=+145.419235327" Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.687827 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv" event={"ID":"a9315435-1144-473c-9d20-8a0ea85d1199","Type":"ContainerStarted","Data":"35b12fe071d0a0d575ff4594c52c128022600850229167bc104f040d57516386"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.709286 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:50 crc kubenswrapper[4717]: E0221 21:48:50.709607 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:51.209591352 +0000 UTC m=+145.991124984 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.744443 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" event={"ID":"5c73e02c-cb77-47e2-bf8a-1092ada428d5","Type":"ContainerStarted","Data":"7aa4d2d943f741685966146550b2f5de890fd573890da1dc28289ff43ff2689f"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.745582 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.760118 4717 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-76zmn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" start-of-body= Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 
21:48:50.760190 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" podUID="5c73e02c-cb77-47e2-bf8a-1092ada428d5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.772214 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf" event={"ID":"290a3ba3-ff12-4724-9cdb-c41dab7b0827","Type":"ContainerStarted","Data":"fb4ead394556ba1f72a45e5741909fed687b04487bd77ee4faa68561a3585b22"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.780286 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh" event={"ID":"fe4fc787-106a-4c35-ba6c-74a64a69db9f","Type":"ContainerStarted","Data":"548081df70de28f16bdb0923077f292b20717d0987c8acc3be85d190ed65388c"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.782038 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hplrj" event={"ID":"87db50f7-cf1f-4f32-b6f4-4f641261c731","Type":"ContainerStarted","Data":"0c11f043da0c1d487c685466c155f29f21667aab1150721b109dd4a99db1fd3b"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.783666 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-47lf7" event={"ID":"17bb07e6-67dd-4cc5-b979-9ef794228e81","Type":"ContainerStarted","Data":"d5234076e40cdd477af37c25c5f50bc96be68a8f5e1142e726f1bf1088f02abd"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.784647 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv" 
event={"ID":"450eb855-2d6d-4503-9a7e-1980d8e97346","Type":"ContainerStarted","Data":"c80f836e439309bb35a76fd16bfdf56c5e1cf94d3da04180ffc50506a6d8cac1"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.785776 4717 generic.go:334] "Generic (PLEG): container finished" podID="bdefa569-260f-4b10-a611-ab781c4fea72" containerID="337e1cd444c97dc3733a7f5bec63c01614c90d92aee4436e975e00e4f8cbdb59" exitCode=0 Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.785814 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752" event={"ID":"bdefa569-260f-4b10-a611-ab781c4fea72","Type":"ContainerDied","Data":"337e1cd444c97dc3733a7f5bec63c01614c90d92aee4436e975e00e4f8cbdb59"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.798640 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bsjn6" event={"ID":"e172f7a2-4889-47d8-b0b1-a04114d0e328","Type":"ContainerStarted","Data":"8c6f69a4aefb38856489162b0a975cb67106cedc2af1a2769cce5b684dbc703c"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.814036 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-8cf52" podStartSLOduration=123.81401915000001 podStartE2EDuration="2m3.81401915s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:50.687616052 +0000 UTC m=+145.469149674" watchObservedRunningTime="2026-02-21 21:48:50.81401915 +0000 UTC m=+145.595552772" Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.815106 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" podStartSLOduration=123.815099965 podStartE2EDuration="2m3.815099965s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:50.813383116 +0000 UTC m=+145.594916728" watchObservedRunningTime="2026-02-21 21:48:50.815099965 +0000 UTC m=+145.596633587" Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.818420 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:50 crc kubenswrapper[4717]: E0221 21:48:50.820165 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:51.320137419 +0000 UTC m=+146.101671031 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.841139 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-bsjn6" podStartSLOduration=6.841123616 podStartE2EDuration="6.841123616s" podCreationTimestamp="2026-02-21 21:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:50.839358657 +0000 UTC m=+145.620892279" watchObservedRunningTime="2026-02-21 21:48:50.841123616 +0000 UTC m=+145.622657238" Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.861617 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tplh" podStartSLOduration=123.861597873 podStartE2EDuration="2m3.861597873s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:50.861052161 +0000 UTC m=+145.642585783" watchObservedRunningTime="2026-02-21 21:48:50.861597873 +0000 UTC m=+145.643131495" Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.919042 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj" event={"ID":"2e4f85de-f15f-40b8-9bfe-914862b6c20e","Type":"ContainerStarted","Data":"c623b75449c5acc2e2c9b1ca1911df7175dcf6f5281cdcd5761914d2d191a65c"} Feb 21 
21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.919771 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:50 crc kubenswrapper[4717]: E0221 21:48:50.920307 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:51.42029354 +0000 UTC m=+146.201827162 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.927126 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-hplrj" podStartSLOduration=122.927089544 podStartE2EDuration="2m2.927089544s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:50.919550602 +0000 UTC m=+145.701084224" watchObservedRunningTime="2026-02-21 21:48:50.927089544 +0000 UTC m=+145.708623166" Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.944069 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" event={"ID":"d105add5-0618-41cf-ae5d-59b739694e4c","Type":"ContainerStarted","Data":"8347825bb77d0cf07347a3a1246851aa0cd174bca5318a07ff31ee53765d7c74"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.945068 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.966283 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9j4nq" event={"ID":"5d75838b-1762-4f29-89be-817b97eb22f0","Type":"ContainerStarted","Data":"c87ca0099eefc7ce18a7d56f73055b7dee119e60ab587e9375a34d85702db531"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.984875 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5" event={"ID":"395e9ebb-c913-43df-bab9-f4e181167728","Type":"ContainerStarted","Data":"029f21145d816ad4c303a98ec4eda3bbf9b4ee28bdbcd189a895eb1880e4f931"} Feb 21 21:48:50 crc kubenswrapper[4717]: I0221 21:48:50.991676 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2rv95" event={"ID":"e561b490-ea66-4906-9f8b-299a7e5909cf","Type":"ContainerStarted","Data":"aa34b03e0545889b44bb4342e834c036847bdcf339c278ab83ee610d8e3a9dc9"} Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:50.998825 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r68hh" event={"ID":"ff47caaa-cf37-40ea-8c6c-457189a5432b","Type":"ContainerStarted","Data":"89f0ce1bc30255f7617ae763eaa16184b24cc4cd571793b071508ad94a42b342"} Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.007928 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" 
event={"ID":"60d25c82-47d6-4706-8235-70fd592a984d","Type":"ContainerStarted","Data":"cdff9510ac1d6bce743f1b01fe034ceb14fed18f7fba25c9e42e698a3b1c3fe6"} Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.008902 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.017785 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" event={"ID":"907ac86c-cb8a-41c6-b88f-4ce3c9ffdc30","Type":"ContainerStarted","Data":"29de1a98404595eddd7801239d89b7e6a479d6ca5f8f58802d555bf8d22ebeea"} Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.022809 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:51 crc kubenswrapper[4717]: E0221 21:48:51.024417 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:51.524381159 +0000 UTC m=+146.305914771 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.032414 4717 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8q2r4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.032448 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" podUID="60d25c82-47d6-4706-8235-70fd592a984d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.048653 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w" event={"ID":"520428c8-dcc3-4649-8b61-346774136b38","Type":"ContainerStarted","Data":"dc11f405e88e3e0e176630f22ff1b0d900029d33e25c5225d7ba70c8f89e0343"} Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.053792 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-47lf7" podStartSLOduration=124.053768318 podStartE2EDuration="2m4.053768318s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:50.958326075 
+0000 UTC m=+145.739859707" watchObservedRunningTime="2026-02-21 21:48:51.053768318 +0000 UTC m=+145.835301940" Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.090534 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" podStartSLOduration=123.090517516 podStartE2EDuration="2m3.090517516s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:51.087606268 +0000 UTC m=+145.869139890" watchObservedRunningTime="2026-02-21 21:48:51.090517516 +0000 UTC m=+145.872051138" Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.124438 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:51 crc kubenswrapper[4717]: E0221 21:48:51.128911 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:51.628878029 +0000 UTC m=+146.410411651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.225810 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:51 crc kubenswrapper[4717]: E0221 21:48:51.226431 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:51.72641358 +0000 UTC m=+146.507947202 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.239964 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w" podStartSLOduration=124.239934997 podStartE2EDuration="2m4.239934997s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:51.15752194 +0000 UTC m=+145.939055562" watchObservedRunningTime="2026-02-21 21:48:51.239934997 +0000 UTC m=+146.021468619" Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.305316 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" podStartSLOduration=123.305291705 podStartE2EDuration="2m3.305291705s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:51.223533964 +0000 UTC m=+146.005067586" watchObservedRunningTime="2026-02-21 21:48:51.305291705 +0000 UTC m=+146.086825327" Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.305702 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2rv95" podStartSLOduration=6.305697104 podStartE2EDuration="6.305697104s" podCreationTimestamp="2026-02-21 21:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:51.289886405 +0000 UTC m=+146.071420027" watchObservedRunningTime="2026-02-21 21:48:51.305697104 +0000 UTC m=+146.087230726" Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.325997 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-qm2kf" podStartSLOduration=124.325977797 podStartE2EDuration="2m4.325977797s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:51.324130724 +0000 UTC m=+146.105664346" watchObservedRunningTime="2026-02-21 21:48:51.325977797 +0000 UTC m=+146.107511419" Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.328556 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:51 crc kubenswrapper[4717]: E0221 21:48:51.333331 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:51.833307183 +0000 UTC m=+146.614840805 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.430505 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:51 crc kubenswrapper[4717]: E0221 21:48:51.431854 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:51.931821466 +0000 UTC m=+146.713355088 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.533984 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:51 crc kubenswrapper[4717]: E0221 21:48:51.534480 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:52.034460294 +0000 UTC m=+146.815993916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.577371 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9"
Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.628718 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 21 21:48:51 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld
Feb 21 21:48:51 crc kubenswrapper[4717]: [+]process-running ok
Feb 21 21:48:51 crc kubenswrapper[4717]: healthz check failed
Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.629354 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.637648 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 21:48:51 crc kubenswrapper[4717]: E0221 21:48:51.638134 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:52.138116264 +0000 UTC m=+146.919649886 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.741318 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:51 crc kubenswrapper[4717]: E0221 21:48:51.741946 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:52.241931977 +0000 UTC m=+147.023465599 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.847487 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 21:48:51 crc kubenswrapper[4717]: E0221 21:48:51.847850 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:52.347833008 +0000 UTC m=+147.129366630 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:51 crc kubenswrapper[4717]: I0221 21:48:51.948532 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:51 crc kubenswrapper[4717]: E0221 21:48:51.949435 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:52.449423361 +0000 UTC m=+147.230956983 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.053292 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 21:48:52 crc kubenswrapper[4717]: E0221 21:48:52.053696 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:52.553676535 +0000 UTC m=+147.335210157 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.097182 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2rv95" event={"ID":"e561b490-ea66-4906-9f8b-299a7e5909cf","Type":"ContainerStarted","Data":"ab43f182fb4b3d878dd8dd664f2f18ee91820b0864438cf7acbbe3d8040cd4eb"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.115949 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8vn67" event={"ID":"882d3e28-3823-42b6-8973-cd07043ad24d","Type":"ContainerStarted","Data":"eaa1cf34360acd3f6ab2c40039745cb495b6fa2e27483b7db000a6d11e006a64"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.115997 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8vn67" event={"ID":"882d3e28-3823-42b6-8973-cd07043ad24d","Type":"ContainerStarted","Data":"5f30184acfc3710658b58af4fd86139408862d69aff3e6f2f037aec9fcc363b8"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.135875 4717 generic.go:334] "Generic (PLEG): container finished" podID="72d1d285-af56-462a-80e0-985f8b689b10" containerID="f2affd92ffa19a5c720890fb476425129ac471678ce62e84c3675ad3ad3a2558" exitCode=0
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.136412 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" event={"ID":"72d1d285-af56-462a-80e0-985f8b689b10","Type":"ContainerDied","Data":"f2affd92ffa19a5c720890fb476425129ac471678ce62e84c3675ad3ad3a2558"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.157825 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.159038 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh" event={"ID":"6521b257-6f92-42e5-be73-72c45ecfc58a","Type":"ContainerStarted","Data":"a0a1aaae88c8ea4e1daf11c2efbff79b95a91ab66fc464c4ee17c1dc2e2cb104"}
Feb 21 21:48:52 crc kubenswrapper[4717]: E0221 21:48:52.159225 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:52.659203468 +0000 UTC m=+147.440737090 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.184893 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8vn67" podStartSLOduration=124.184851972 podStartE2EDuration="2m4.184851972s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:52.178583629 +0000 UTC m=+146.960117251" watchObservedRunningTime="2026-02-21 21:48:52.184851972 +0000 UTC m=+146.966385594"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.188307 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf" event={"ID":"290a3ba3-ff12-4724-9cdb-c41dab7b0827","Type":"ContainerStarted","Data":"b3a7ca11c0d5c96fb79b2e878c063e7cb2c44a1e1d14293d0d22c723d7ee5437"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.194285 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5" event={"ID":"395e9ebb-c913-43df-bab9-f4e181167728","Type":"ContainerStarted","Data":"a0b923c39af6615759bad8d22c2b6c85a5539622159ab6a9bd98ffb882168945"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.195703 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.202991 4717 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-ds4d5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body=
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.203036 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5" podUID="395e9ebb-c913-43df-bab9-f4e181167728" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.233406 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn" event={"ID":"56758c2e-648d-41fe-8758-439f0070d150","Type":"ContainerStarted","Data":"23bc2942e6efcd66e36d6c0fff9df149f16c178b7377154e6fd83e94a3138d1f"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.260200 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 21:48:52 crc kubenswrapper[4717]: E0221 21:48:52.261976 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:52.761950387 +0000 UTC m=+147.543484009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.263255 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj" event={"ID":"2e4f85de-f15f-40b8-9bfe-914862b6c20e","Type":"ContainerStarted","Data":"d2f72a8982dddc4c7543001da67e175b6d4f1e09d82f9e5963a15ad1e0c6a7f1"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.264326 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.288732 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dh4pw" event={"ID":"07887dc7-0606-470d-9c98-fa209d695e60","Type":"ContainerStarted","Data":"c8c5033e94a3f1554a15be8914033c91650c7967e80bf5f41f510f94d505c15a"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.289483 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-dh4pw"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.289573 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.332111 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r68hh" event={"ID":"ff47caaa-cf37-40ea-8c6c-457189a5432b","Type":"ContainerStarted","Data":"8e0b2b1c07a303544027ede2352a2eee86fafa2b1dc138440d8c960f3d0e2501"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.361571 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:52 crc kubenswrapper[4717]: E0221 21:48:52.362417 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:52.862405024 +0000 UTC m=+147.643938646 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.365096 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lj6tc" event={"ID":"c6f53bb9-51b7-4a4b-b348-da4598ceccbc","Type":"ContainerStarted","Data":"ab3cde4210ba7a34245a1db4607eee805f5df832cc010cf0f0c4df4e245dfad3"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.395077 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-st74w" event={"ID":"520428c8-dcc3-4649-8b61-346774136b38","Type":"ContainerStarted","Data":"255e2165474a1ee9073392df7f8d89e00aa41f302633068ab1cf78cc7e792843"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.399379 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-l7kgh" podStartSLOduration=125.399354465 podStartE2EDuration="2m5.399354465s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:52.321245408 +0000 UTC m=+147.102779030" watchObservedRunningTime="2026-02-21 21:48:52.399354465 +0000 UTC m=+147.180888087"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.444104 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p" event={"ID":"d2206ca5-861f-4a89-8218-8c8a1264b2d8","Type":"ContainerStarted","Data":"0fece1874c914f4012193098f7bec7c7aa7d6d93fd636609fc8ab0bb38841aac"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.466308 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 21:48:52 crc kubenswrapper[4717]: E0221 21:48:52.467918 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:52.967897706 +0000 UTC m=+147.749431328 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.463571 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zkqdj" podStartSLOduration=124.463549087 podStartE2EDuration="2m4.463549087s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:52.401398562 +0000 UTC m=+147.182932184" watchObservedRunningTime="2026-02-21 21:48:52.463549087 +0000 UTC m=+147.245082709"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.485205 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" event={"ID":"a26b8bcd-9a3f-4fd0-8d35-2272127096e8","Type":"ContainerStarted","Data":"be913bfb599118a78c817d61fc0235aced5883e254519713840e9e10a96956ee"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.519421 4717 generic.go:334] "Generic (PLEG): container finished" podID="c6a85015-ccec-4075-8749-fc07bbea7344" containerID="a409520761de8d220c4c809c603ba4a0c4246ecef2c50ea370b94391e07c28e2" exitCode=0
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.519530 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" event={"ID":"c6a85015-ccec-4075-8749-fc07bbea7344","Type":"ContainerDied","Data":"a409520761de8d220c4c809c603ba4a0c4246ecef2c50ea370b94391e07c28e2"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.519559 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" event={"ID":"c6a85015-ccec-4075-8749-fc07bbea7344","Type":"ContainerStarted","Data":"d2dcf9517496ddd5a804aa909d13ee4051ccf4fda4d15fde28efff17d7cd5d2a"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.532231 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv" event={"ID":"d4fde7d3-808e-437a-a4b1-a3f44570ba55","Type":"ContainerStarted","Data":"ee2f93b9964fbb1a048b0f7565aee19889b3124f7e12ef631e3fe0e6c1d64f94"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.565544 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-84tzn" podStartSLOduration=124.565528659 podStartE2EDuration="2m4.565528659s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:52.478490528 +0000 UTC m=+147.260024150" watchObservedRunningTime="2026-02-21 21:48:52.565528659 +0000 UTC m=+147.347062281"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.573962 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.574130 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" event={"ID":"60d25c82-47d6-4706-8235-70fd592a984d","Type":"ContainerStarted","Data":"876c324a12fc91ef3801bbaa01debf7a01dcdb1665fde5c01429e6cd393f2d20"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.575232 4717 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8q2r4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body=
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.575266 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" podUID="60d25c82-47d6-4706-8235-70fd592a984d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused"
Feb 21 21:48:52 crc kubenswrapper[4717]: E0221 21:48:52.576371 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.076358276 +0000 UTC m=+147.857891898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.610004 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bb9bf" podStartSLOduration=124.609985782 podStartE2EDuration="2m4.609985782s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:52.56603055 +0000 UTC m=+147.347564172" watchObservedRunningTime="2026-02-21 21:48:52.609985782 +0000 UTC m=+147.391519404"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.629786 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 21 21:48:52 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld
Feb 21 21:48:52 crc kubenswrapper[4717]: [+]process-running ok
Feb 21 21:48:52 crc kubenswrapper[4717]: healthz check failed
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.630240 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.638347 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv" event={"ID":"450eb855-2d6d-4503-9a7e-1980d8e97346","Type":"ContainerStarted","Data":"f168c1a45fac9a09bf48c9e2e1de86c5f6eaab1f1265501af4315869e6fcb1dc"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.702651 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 21:48:52 crc kubenswrapper[4717]: E0221 21:48:52.709682 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.209647781 +0000 UTC m=+147.991181403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.710235 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.710261 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dh4pw" podStartSLOduration=8.710228513 podStartE2EDuration="8.710228513s" podCreationTimestamp="2026-02-21 21:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:52.630319094 +0000 UTC m=+147.411852716" watchObservedRunningTime="2026-02-21 21:48:52.710228513 +0000 UTC m=+147.491762135"
Feb 21 21:48:52 crc kubenswrapper[4717]: E0221 21:48:52.716068 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.216026776 +0000 UTC m=+147.997560398 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.744274 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5" podStartSLOduration=124.744246679 podStartE2EDuration="2m4.744246679s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:52.700266947 +0000 UTC m=+147.481800569" watchObservedRunningTime="2026-02-21 21:48:52.744246679 +0000 UTC m=+147.525780301"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.745191 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" event={"ID":"6dcf9089-19f0-421c-83bc-f5ad21bb07a5","Type":"ContainerStarted","Data":"420bbcc685ba9c0f02db9a2ae37cc469067156401782d818badedb6ddb46c984"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.745340 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" event={"ID":"6dcf9089-19f0-421c-83bc-f5ad21bb07a5","Type":"ContainerStarted","Data":"038c61ce5ee8ae832864bff4a094229bc34a57453c4dca0be1d502a247d40a5b"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.747220 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.768725 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv" podStartSLOduration=125.768691955 podStartE2EDuration="2m5.768691955s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:52.748347492 +0000 UTC m=+147.529881134" watchObservedRunningTime="2026-02-21 21:48:52.768691955 +0000 UTC m=+147.550225577"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.774589 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv" event={"ID":"a9315435-1144-473c-9d20-8a0ea85d1199","Type":"ContainerStarted","Data":"3af4c351f5978ac5ca8b7e315ce7323de2c7ba78934fdbb47f0b40ad891709db"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.775422 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.800891 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-pq4lv" podStartSLOduration=125.800872198 podStartE2EDuration="2m5.800872198s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:52.799465716 +0000 UTC m=+147.580999338" watchObservedRunningTime="2026-02-21 21:48:52.800872198 +0000 UTC m=+147.582405820"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.808483 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-phxph" event={"ID":"83f9314e-459c-4866-830b-80e171b696dd","Type":"ContainerStarted","Data":"60f16846b2beba95ce98a145adc7771b3fa994a578ac70d69a621f06b7365f52"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.812276 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 21 21:48:52 crc kubenswrapper[4717]: E0221 21:48:52.814013 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.313993506 +0000 UTC m=+148.095527128 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.848934 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9j4nq" event={"ID":"5d75838b-1762-4f29-89be-817b97eb22f0","Type":"ContainerStarted","Data":"a833a5cdf0c8b5ef17bf07cb37b9f8073f19034c2e053037353f46ae6f4f7ffa"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.849663 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9j4nq" event={"ID":"5d75838b-1762-4f29-89be-817b97eb22f0","Type":"ContainerStarted","Data":"4d26d6e556aef106d7458ab41bc54adbb9d8995f49f090a9ad8e1ad8823e82dd"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.888118 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6sxz" event={"ID":"369bb866-95f1-42db-8e35-9e89b6f0d157","Type":"ContainerStarted","Data":"98e302168ca25584c5d381aa1a247e82b01146d66c2e89c0f9b2ae2d816d12c3"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.888463 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6sxz" event={"ID":"369bb866-95f1-42db-8e35-9e89b6f0d157","Type":"ContainerStarted","Data":"e1f970feabf4e361bb62d554f87c1b5c378efc4710c5d222a1ca15cfe6326559"}
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.892761 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-8cf52 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body=
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.892963 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8cf52" podUID="3f6219e3-5e33-4c13-9c52-de506ba0b6a6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.915307 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:52 crc kubenswrapper[4717]: E0221 21:48:52.919350 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.419334155 +0000 UTC m=+148.200867777 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.971556 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r68hh" podStartSLOduration=124.971531873 podStartE2EDuration="2m4.971531873s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:52.97139218 +0000 UTC m=+147.752925802" watchObservedRunningTime="2026-02-21 21:48:52.971531873 +0000 UTC m=+147.753065485"
Feb 21 21:48:52 crc kubenswrapper[4717]: I0221 21:48:52.972217 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p6d8p" podStartSLOduration=124.972210598 podStartE2EDuration="2m4.972210598s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:52.920621945 +0000 UTC
m=+147.702155567" watchObservedRunningTime="2026-02-21 21:48:52.972210598 +0000 UTC m=+147.753744220" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.028744 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:53 crc kubenswrapper[4717]: E0221 21:48:53.029391 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.52937166 +0000 UTC m=+148.310905282 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.046802 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-phxph" podStartSLOduration=125.046785707 podStartE2EDuration="2m5.046785707s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:53.04426788 +0000 UTC m=+147.825801502" watchObservedRunningTime="2026-02-21 21:48:53.046785707 +0000 UTC m=+147.828319329" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.120502 
4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" podStartSLOduration=125.120479934 podStartE2EDuration="2m5.120479934s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:53.07108296 +0000 UTC m=+147.852616582" watchObservedRunningTime="2026-02-21 21:48:53.120479934 +0000 UTC m=+147.902013556" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.136330 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:53 crc kubenswrapper[4717]: E0221 21:48:53.136695 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.636682954 +0000 UTC m=+148.418216576 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.156224 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.190120 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6sxz" podStartSLOduration=126.19010028 podStartE2EDuration="2m6.19010028s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:53.187373917 +0000 UTC m=+147.968907539" watchObservedRunningTime="2026-02-21 21:48:53.19010028 +0000 UTC m=+147.971633902" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.190568 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9j4nq" podStartSLOduration=125.19056372 podStartE2EDuration="2m5.19056372s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:53.140349727 +0000 UTC m=+147.921883349" watchObservedRunningTime="2026-02-21 21:48:53.19056372 +0000 UTC m=+147.972097342" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.239047 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:53 crc kubenswrapper[4717]: E0221 21:48:53.239433 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.739414642 +0000 UTC m=+148.520948264 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.242071 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv" podStartSLOduration=125.242054012 podStartE2EDuration="2m5.242054012s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:53.229289862 +0000 UTC m=+148.010823484" watchObservedRunningTime="2026-02-21 21:48:53.242054012 +0000 UTC m=+148.023587634" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.308363 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5mbd" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.341945 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:53 crc kubenswrapper[4717]: E0221 21:48:53.342333 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.842317866 +0000 UTC m=+148.623851488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.443408 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:53 crc kubenswrapper[4717]: E0221 21:48:53.444005 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.94396979 +0000 UTC m=+148.725503412 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.444137 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:53 crc kubenswrapper[4717]: E0221 21:48:53.444473 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:53.944459911 +0000 UTC m=+148.725993533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.545657 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:53 crc kubenswrapper[4717]: E0221 21:48:53.545831 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:54.045790218 +0000 UTC m=+148.827323840 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.546180 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:53 crc kubenswrapper[4717]: E0221 21:48:53.546553 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:54.046539286 +0000 UTC m=+148.828072908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.622938 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 21:48:53 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 21 21:48:53 crc kubenswrapper[4717]: [+]process-running ok Feb 21 21:48:53 crc kubenswrapper[4717]: healthz check failed Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.623020 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.647143 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:53 crc kubenswrapper[4717]: E0221 21:48:53.647332 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-21 21:48:54.14729658 +0000 UTC m=+148.928830202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.647590 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.647639 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:53 crc kubenswrapper[4717]: E0221 21:48:53.647996 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:54.147988165 +0000 UTC m=+148.929521787 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.655507 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.749304 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:53 crc kubenswrapper[4717]: E0221 21:48:53.749541 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:54.249502947 +0000 UTC m=+149.031036579 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.750089 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.750693 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.750829 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:53 crc kubenswrapper[4717]: E0221 21:48:53.751260 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-02-21 21:48:54.251243006 +0000 UTC m=+149.032776628 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.751827 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.754462 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.801909 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.813820 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.851982 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:53 crc kubenswrapper[4717]: E0221 21:48:53.852241 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:54.352189095 +0000 UTC m=+149.133722737 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.852284 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.852338 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:53 crc kubenswrapper[4717]: E0221 21:48:53.852851 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:54.352840609 +0000 UTC m=+149.134374291 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.864773 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.918511 4717 generic.go:334] "Generic (PLEG): container finished" podID="450eb855-2d6d-4503-9a7e-1980d8e97346" containerID="f168c1a45fac9a09bf48c9e2e1de86c5f6eaab1f1265501af4315869e6fcb1dc" exitCode=0 Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.918629 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv" 
event={"ID":"450eb855-2d6d-4503-9a7e-1980d8e97346","Type":"ContainerDied","Data":"f168c1a45fac9a09bf48c9e2e1de86c5f6eaab1f1265501af4315869e6fcb1dc"} Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.928564 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lj6tc" event={"ID":"c6f53bb9-51b7-4a4b-b348-da4598ceccbc","Type":"ContainerStarted","Data":"4e623d3c3dd91fe9b2778a1422aff6f528e7e7b64067110de26555f7e9de1280"} Feb 21 21:48:53 crc kubenswrapper[4717]: I0221 21:48:53.954518 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:53 crc kubenswrapper[4717]: E0221 21:48:53.954998 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:54.454978425 +0000 UTC m=+149.236512047 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.052487 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" event={"ID":"72d1d285-af56-462a-80e0-985f8b689b10","Type":"ContainerStarted","Data":"c79e2f0a33165edab1ed9c5feef363e374ad82684f685daaef6d8d24687b5d6b"} Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.052548 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" event={"ID":"72d1d285-af56-462a-80e0-985f8b689b10","Type":"ContainerStarted","Data":"1ed0b7693da079d414e77437764c6103aac1e6e44d4f080ca5f26047b97e45a3"} Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.055329 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752" event={"ID":"bdefa569-260f-4b10-a611-ab781c4fea72","Type":"ContainerStarted","Data":"ce8ea38d9cf4e38c232a2b39849eb5a581a27bf4ce4bac2ba774a7ae9a295c98"} Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.055982 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.063011 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: 
\"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:54 crc kubenswrapper[4717]: E0221 21:48:54.065409 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:54.565393579 +0000 UTC m=+149.346927201 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.066779 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" event={"ID":"a26b8bcd-9a3f-4fd0-8d35-2272127096e8","Type":"ContainerStarted","Data":"05a1ecefde0ed66b72b501e09c669cfe2a76727ca2a7f3e31f0dcabe8d8d7c0d"} Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.071773 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-lj6tc" podStartSLOduration=127.071739704 podStartE2EDuration="2m7.071739704s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:54.000302267 +0000 UTC m=+148.781835899" watchObservedRunningTime="2026-02-21 21:48:54.071739704 +0000 UTC m=+148.853273326" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.071938 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" podStartSLOduration=127.071932478 podStartE2EDuration="2m7.071932478s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:54.060358265 +0000 UTC m=+148.841891887" watchObservedRunningTime="2026-02-21 21:48:54.071932478 +0000 UTC m=+148.853466110" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.108953 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752" podStartSLOduration=127.10893106 podStartE2EDuration="2m7.10893106s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:54.107721022 +0000 UTC m=+148.889254644" watchObservedRunningTime="2026-02-21 21:48:54.10893106 +0000 UTC m=+148.890464672" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.109745 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" event={"ID":"c6a85015-ccec-4075-8749-fc07bbea7344","Type":"ContainerStarted","Data":"9cd64a39bba70c6eaeecaef57b10c619bedfc53feeff4f2e79a7f01360a7c001"} Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.123985 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.153429 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dh4pw" event={"ID":"07887dc7-0606-470d-9c98-fa209d695e60","Type":"ContainerStarted","Data":"cdd0695f8da1030194466a9fe559c31afe5bb63fc55f5957627ec7efd4856db5"} Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.172007 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:54 crc kubenswrapper[4717]: E0221 21:48:54.173159 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:54.673138162 +0000 UTC m=+149.454671784 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.188854 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6sxz" event={"ID":"369bb866-95f1-42db-8e35-9e89b6f0d157","Type":"ContainerStarted","Data":"af3254a916421ec9b088e8ad823f2614a550d47ed597b503e21aae71ff137fe2"} Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.195127 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-8cf52 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.195227 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8cf52" podUID="3f6219e3-5e33-4c13-9c52-de506ba0b6a6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.207149 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.234486 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ds4d5" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.235500 4717 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" podStartSLOduration=126.235475112 podStartE2EDuration="2m6.235475112s" podCreationTimestamp="2026-02-21 21:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:54.166271686 +0000 UTC m=+148.947805308" watchObservedRunningTime="2026-02-21 21:48:54.235475112 +0000 UTC m=+149.017008734" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.273591 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:54 crc kubenswrapper[4717]: E0221 21:48:54.277052 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:54.777033277 +0000 UTC m=+149.558566899 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.375575 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:54 crc kubenswrapper[4717]: E0221 21:48:54.376118 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:54.876078843 +0000 UTC m=+149.657612465 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.376328 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:54 crc kubenswrapper[4717]: E0221 21:48:54.377454 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:54.877426773 +0000 UTC m=+149.658960395 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:54 crc kubenswrapper[4717]: W0221 21:48:54.468012 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-a8709c75400e838f5364846bc13b1e17926fdc5d9b6678a4b6756f7262cf2ed4 WatchSource:0}: Error finding container a8709c75400e838f5364846bc13b1e17926fdc5d9b6678a4b6756f7262cf2ed4: Status 404 returned error can't find the container with id a8709c75400e838f5364846bc13b1e17926fdc5d9b6678a4b6756f7262cf2ed4 Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.478604 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:54 crc kubenswrapper[4717]: E0221 21:48:54.478963 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:54.978940875 +0000 UTC m=+149.760474497 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.579674 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:54 crc kubenswrapper[4717]: E0221 21:48:54.580089 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:55.080077168 +0000 UTC m=+149.861610790 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.624938 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 21:48:54 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 21 21:48:54 crc kubenswrapper[4717]: [+]process-running ok Feb 21 21:48:54 crc kubenswrapper[4717]: healthz check failed Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.624995 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.680942 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:54 crc kubenswrapper[4717]: E0221 21:48:54.681403 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-21 21:48:55.181380044 +0000 UTC m=+149.962913666 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.690493 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9kmrb"] Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.691774 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9kmrb" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.697398 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.751020 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9kmrb"] Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.765564 4717 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.792243 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dh5k\" (UniqueName: \"kubernetes.io/projected/80125217-20f6-4337-8be2-8874b40aa10e-kube-api-access-5dh5k\") pod \"community-operators-9kmrb\" (UID: \"80125217-20f6-4337-8be2-8874b40aa10e\") " pod="openshift-marketplace/community-operators-9kmrb" Feb 21 21:48:54 crc kubenswrapper[4717]: 
I0221 21:48:54.792297 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80125217-20f6-4337-8be2-8874b40aa10e-catalog-content\") pod \"community-operators-9kmrb\" (UID: \"80125217-20f6-4337-8be2-8874b40aa10e\") " pod="openshift-marketplace/community-operators-9kmrb" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.792423 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80125217-20f6-4337-8be2-8874b40aa10e-utilities\") pod \"community-operators-9kmrb\" (UID: \"80125217-20f6-4337-8be2-8874b40aa10e\") " pod="openshift-marketplace/community-operators-9kmrb" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.792503 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:54 crc kubenswrapper[4717]: E0221 21:48:54.792820 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:55.292808061 +0000 UTC m=+150.074341683 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.884591 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d7bn5"] Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.891934 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d7bn5" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.904314 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.906135 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:54 crc kubenswrapper[4717]: E0221 21:48:54.906555 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-21 21:48:55.40653155 +0000 UTC m=+150.188065172 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.907105 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80125217-20f6-4337-8be2-8874b40aa10e-catalog-content\") pod \"community-operators-9kmrb\" (UID: \"80125217-20f6-4337-8be2-8874b40aa10e\") " pod="openshift-marketplace/community-operators-9kmrb" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.907439 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-utilities\") pod \"certified-operators-d7bn5\" (UID: \"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb\") " pod="openshift-marketplace/certified-operators-d7bn5" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.907671 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80125217-20f6-4337-8be2-8874b40aa10e-catalog-content\") pod \"community-operators-9kmrb\" (UID: \"80125217-20f6-4337-8be2-8874b40aa10e\") " pod="openshift-marketplace/community-operators-9kmrb" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.907807 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80125217-20f6-4337-8be2-8874b40aa10e-utilities\") pod \"community-operators-9kmrb\" (UID: \"80125217-20f6-4337-8be2-8874b40aa10e\") " 
pod="openshift-marketplace/community-operators-9kmrb" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.907930 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.908050 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dh5k\" (UniqueName: \"kubernetes.io/projected/80125217-20f6-4337-8be2-8874b40aa10e-kube-api-access-5dh5k\") pod \"community-operators-9kmrb\" (UID: \"80125217-20f6-4337-8be2-8874b40aa10e\") " pod="openshift-marketplace/community-operators-9kmrb" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.908756 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80125217-20f6-4337-8be2-8874b40aa10e-utilities\") pod \"community-operators-9kmrb\" (UID: \"80125217-20f6-4337-8be2-8874b40aa10e\") " pod="openshift-marketplace/community-operators-9kmrb" Feb 21 21:48:54 crc kubenswrapper[4717]: E0221 21:48:54.909111 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-21 21:48:55.409096449 +0000 UTC m=+150.190630071 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wqr6g" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.913537 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d7bn5"] Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.935529 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dh5k\" (UniqueName: \"kubernetes.io/projected/80125217-20f6-4337-8be2-8874b40aa10e-kube-api-access-5dh5k\") pod \"community-operators-9kmrb\" (UID: \"80125217-20f6-4337-8be2-8874b40aa10e\") " pod="openshift-marketplace/community-operators-9kmrb" Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.940761 4717 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-21T21:48:54.765589201Z","Handler":null,"Name":""} Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.952669 4717 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 21 21:48:54 crc kubenswrapper[4717]: I0221 21:48:54.953186 4717 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.009690 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.010453 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-catalog-content\") pod \"certified-operators-d7bn5\" (UID: \"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb\") " pod="openshift-marketplace/certified-operators-d7bn5" Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.010555 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwqx9\" (UniqueName: \"kubernetes.io/projected/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-kube-api-access-nwqx9\") pod \"certified-operators-d7bn5\" (UID: \"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb\") " pod="openshift-marketplace/certified-operators-d7bn5" Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.010658 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-utilities\") pod \"certified-operators-d7bn5\" (UID: \"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb\") " pod="openshift-marketplace/certified-operators-d7bn5" Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.011263 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-utilities\") pod \"certified-operators-d7bn5\" (UID: \"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb\") " pod="openshift-marketplace/certified-operators-d7bn5" Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.020430 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.032727 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9kmrb" Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.079667 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-99cnm"] Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.080946 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-99cnm" Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.092044 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-99cnm"] Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.116333 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.116392 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a32126-0013-46c3-8562-1a23d1d207b1-utilities\") pod \"community-operators-99cnm\" (UID: \"73a32126-0013-46c3-8562-1a23d1d207b1\") " pod="openshift-marketplace/community-operators-99cnm" Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 
21:48:55.116446 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-catalog-content\") pod \"certified-operators-d7bn5\" (UID: \"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb\") " pod="openshift-marketplace/certified-operators-d7bn5"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.116472 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwqx9\" (UniqueName: \"kubernetes.io/projected/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-kube-api-access-nwqx9\") pod \"certified-operators-d7bn5\" (UID: \"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb\") " pod="openshift-marketplace/certified-operators-d7bn5"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.116499 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjwpm\" (UniqueName: \"kubernetes.io/projected/73a32126-0013-46c3-8562-1a23d1d207b1-kube-api-access-bjwpm\") pod \"community-operators-99cnm\" (UID: \"73a32126-0013-46c3-8562-1a23d1d207b1\") " pod="openshift-marketplace/community-operators-99cnm"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.116529 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a32126-0013-46c3-8562-1a23d1d207b1-catalog-content\") pod \"community-operators-99cnm\" (UID: \"73a32126-0013-46c3-8562-1a23d1d207b1\") " pod="openshift-marketplace/community-operators-99cnm"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.116940 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-catalog-content\") pod \"certified-operators-d7bn5\" (UID: \"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb\") " pod="openshift-marketplace/certified-operators-d7bn5"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.122194 4717 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.122229 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.149537 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwqx9\" (UniqueName: \"kubernetes.io/projected/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-kube-api-access-nwqx9\") pod \"certified-operators-d7bn5\" (UID: \"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb\") " pod="openshift-marketplace/certified-operators-d7bn5"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.167260 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wqr6g\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.197810 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"20bac1bb6fbdc4ec0dd2c9806228258fa21c6c5c87066a91124d4535cfdbb47d"}
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.198494 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"fa27040b836da771b2cf49ff54cdfc8a9c5f3dfbc4c80a9ff53cfd4014f30a03"}
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.199635 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.200928 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"159d06ff9cc0792fc10218655f4303fd03fca64a94b16b15886cc622fa867ee9"}
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.200958 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a8709c75400e838f5364846bc13b1e17926fdc5d9b6678a4b6756f7262cf2ed4"}
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.210755 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5d261ef54a5e89e96743fc15d498781003ae7a441a26441d2777b458b26d0f36"}
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.210818 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"178e79eb7e2f93595bf35c0a6bf6d90df31d77be73cb6c358342eb4614986c0c"}
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.218838 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" event={"ID":"a26b8bcd-9a3f-4fd0-8d35-2272127096e8","Type":"ContainerStarted","Data":"135c95027fee7a4f2d1b62fb741f104f285c8c454ae9649bc7bada2ab2c8f0d6"}
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.218889 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" event={"ID":"a26b8bcd-9a3f-4fd0-8d35-2272127096e8","Type":"ContainerStarted","Data":"0d6929163f44bc05c2da0b7b7cc3146f21b3a6297fa621dc1c2a648dd39217bb"}
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.218925 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" event={"ID":"a26b8bcd-9a3f-4fd0-8d35-2272127096e8","Type":"ContainerStarted","Data":"906b59cefde4fccc75c8f419328e812b00eb1c1954efa1fa0fdf6b1467f4f0e1"}
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.219931 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a32126-0013-46c3-8562-1a23d1d207b1-utilities\") pod \"community-operators-99cnm\" (UID: \"73a32126-0013-46c3-8562-1a23d1d207b1\") " pod="openshift-marketplace/community-operators-99cnm"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.220119 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjwpm\" (UniqueName: \"kubernetes.io/projected/73a32126-0013-46c3-8562-1a23d1d207b1-kube-api-access-bjwpm\") pod \"community-operators-99cnm\" (UID: \"73a32126-0013-46c3-8562-1a23d1d207b1\") " pod="openshift-marketplace/community-operators-99cnm"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.220141 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a32126-0013-46c3-8562-1a23d1d207b1-catalog-content\") pod \"community-operators-99cnm\" (UID: \"73a32126-0013-46c3-8562-1a23d1d207b1\") " pod="openshift-marketplace/community-operators-99cnm"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.229470 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d7bn5"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.231175 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a32126-0013-46c3-8562-1a23d1d207b1-utilities\") pod \"community-operators-99cnm\" (UID: \"73a32126-0013-46c3-8562-1a23d1d207b1\") " pod="openshift-marketplace/community-operators-99cnm"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.231340 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a32126-0013-46c3-8562-1a23d1d207b1-catalog-content\") pod \"community-operators-99cnm\" (UID: \"73a32126-0013-46c3-8562-1a23d1d207b1\") " pod="openshift-marketplace/community-operators-99cnm"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.240691 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b4752"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.261837 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.271653 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjwpm\" (UniqueName: \"kubernetes.io/projected/73a32126-0013-46c3-8562-1a23d1d207b1-kube-api-access-bjwpm\") pod \"community-operators-99cnm\" (UID: \"73a32126-0013-46c3-8562-1a23d1d207b1\") " pod="openshift-marketplace/community-operators-99cnm"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.287522 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h29qn"]
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.289584 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h29qn"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.301471 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-dbpvk" podStartSLOduration=10.301451712 podStartE2EDuration="10.301451712s" podCreationTimestamp="2026-02-21 21:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:55.299651542 +0000 UTC m=+150.081185164" watchObservedRunningTime="2026-02-21 21:48:55.301451712 +0000 UTC m=+150.082985334"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.310750 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h29qn"]
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.359819 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9kmrb"]
Feb 21 21:48:55 crc kubenswrapper[4717]: W0221 21:48:55.375419 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80125217_20f6_4337_8be2_8874b40aa10e.slice/crio-f5ba6839a4c9898a0a6a2be1ce783e056febf09b8c3bf751dee6441d4c844aea WatchSource:0}: Error finding container f5ba6839a4c9898a0a6a2be1ce783e056febf09b8c3bf751dee6441d4c844aea: Status 404 returned error can't find the container with id f5ba6839a4c9898a0a6a2be1ce783e056febf09b8c3bf751dee6441d4c844aea
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.406989 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-99cnm"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.422619 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-utilities\") pod \"certified-operators-h29qn\" (UID: \"1926a2c9-1123-4ab5-b414-d91e0d56c8f2\") " pod="openshift-marketplace/certified-operators-h29qn"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.422847 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv85l\" (UniqueName: \"kubernetes.io/projected/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-kube-api-access-dv85l\") pod \"certified-operators-h29qn\" (UID: \"1926a2c9-1123-4ab5-b414-d91e0d56c8f2\") " pod="openshift-marketplace/certified-operators-h29qn"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.422930 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-catalog-content\") pod \"certified-operators-h29qn\" (UID: \"1926a2c9-1123-4ab5-b414-d91e0d56c8f2\") " pod="openshift-marketplace/certified-operators-h29qn"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.524430 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv85l\" (UniqueName: \"kubernetes.io/projected/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-kube-api-access-dv85l\") pod \"certified-operators-h29qn\" (UID: \"1926a2c9-1123-4ab5-b414-d91e0d56c8f2\") " pod="openshift-marketplace/certified-operators-h29qn"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.524488 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-catalog-content\") pod \"certified-operators-h29qn\" (UID: \"1926a2c9-1123-4ab5-b414-d91e0d56c8f2\") " pod="openshift-marketplace/certified-operators-h29qn"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.524533 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-utilities\") pod \"certified-operators-h29qn\" (UID: \"1926a2c9-1123-4ab5-b414-d91e0d56c8f2\") " pod="openshift-marketplace/certified-operators-h29qn"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.525068 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-utilities\") pod \"certified-operators-h29qn\" (UID: \"1926a2c9-1123-4ab5-b414-d91e0d56c8f2\") " pod="openshift-marketplace/certified-operators-h29qn"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.525159 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-catalog-content\") pod \"certified-operators-h29qn\" (UID: \"1926a2c9-1123-4ab5-b414-d91e0d56c8f2\") " pod="openshift-marketplace/certified-operators-h29qn"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.543275 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv85l\" (UniqueName: \"kubernetes.io/projected/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-kube-api-access-dv85l\") pod \"certified-operators-h29qn\" (UID: \"1926a2c9-1123-4ab5-b414-d91e0d56c8f2\") " pod="openshift-marketplace/certified-operators-h29qn"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.641348 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 21 21:48:55 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld
Feb 21 21:48:55 crc kubenswrapper[4717]: [+]process-running ok
Feb 21 21:48:55 crc kubenswrapper[4717]: healthz check failed
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.641416 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.646079 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h29qn"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.671021 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv"
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.782047 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d7bn5"]
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.831124 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/450eb855-2d6d-4503-9a7e-1980d8e97346-config-volume\") pod \"450eb855-2d6d-4503-9a7e-1980d8e97346\" (UID: \"450eb855-2d6d-4503-9a7e-1980d8e97346\") "
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.831194 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/450eb855-2d6d-4503-9a7e-1980d8e97346-secret-volume\") pod \"450eb855-2d6d-4503-9a7e-1980d8e97346\" (UID: \"450eb855-2d6d-4503-9a7e-1980d8e97346\") "
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.831230 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phcjn\" (UniqueName: \"kubernetes.io/projected/450eb855-2d6d-4503-9a7e-1980d8e97346-kube-api-access-phcjn\") pod \"450eb855-2d6d-4503-9a7e-1980d8e97346\" (UID: \"450eb855-2d6d-4503-9a7e-1980d8e97346\") "
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.834754 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/450eb855-2d6d-4503-9a7e-1980d8e97346-config-volume" (OuterVolumeSpecName: "config-volume") pod "450eb855-2d6d-4503-9a7e-1980d8e97346" (UID: "450eb855-2d6d-4503-9a7e-1980d8e97346"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.842961 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/450eb855-2d6d-4503-9a7e-1980d8e97346-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "450eb855-2d6d-4503-9a7e-1980d8e97346" (UID: "450eb855-2d6d-4503-9a7e-1980d8e97346"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.845101 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/450eb855-2d6d-4503-9a7e-1980d8e97346-kube-api-access-phcjn" (OuterVolumeSpecName: "kube-api-access-phcjn") pod "450eb855-2d6d-4503-9a7e-1980d8e97346" (UID: "450eb855-2d6d-4503-9a7e-1980d8e97346"). InnerVolumeSpecName "kube-api-access-phcjn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.933366 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/450eb855-2d6d-4503-9a7e-1980d8e97346-config-volume\") on node \"crc\" DevicePath \"\""
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.933423 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/450eb855-2d6d-4503-9a7e-1980d8e97346-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.933437 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phcjn\" (UniqueName: \"kubernetes.io/projected/450eb855-2d6d-4503-9a7e-1980d8e97346-kube-api-access-phcjn\") on node \"crc\" DevicePath \"\""
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.962170 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h29qn"]
Feb 21 21:48:55 crc kubenswrapper[4717]: I0221 21:48:55.995824 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.141769 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-99cnm"]
Feb 21 21:48:56 crc kubenswrapper[4717]: W0221 21:48:56.152297 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73a32126_0013_46c3_8562_1a23d1d207b1.slice/crio-c48550bb1025db850b5ac184782300c0124da0272bec3ef38eb345299d7b7cfc WatchSource:0}: Error finding container c48550bb1025db850b5ac184782300c0124da0272bec3ef38eb345299d7b7cfc: Status 404 returned error can't find the container with id c48550bb1025db850b5ac184782300c0124da0272bec3ef38eb345299d7b7cfc
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.177471 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wqr6g"]
Feb 21 21:48:56 crc kubenswrapper[4717]: W0221 21:48:56.208145 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72ebb725_29ae_4902_9b6b_6258039bb6c0.slice/crio-a803ead3efbcf01fa35d118c075edaf7fcdbf9fcfbe2bb5132c2dca34bd7b410 WatchSource:0}: Error finding container a803ead3efbcf01fa35d118c075edaf7fcdbf9fcfbe2bb5132c2dca34bd7b410: Status 404 returned error can't find the container with id a803ead3efbcf01fa35d118c075edaf7fcdbf9fcfbe2bb5132c2dca34bd7b410
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.230644 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv" event={"ID":"450eb855-2d6d-4503-9a7e-1980d8e97346","Type":"ContainerDied","Data":"c80f836e439309bb35a76fd16bfdf56c5e1cf94d3da04180ffc50506a6d8cac1"}
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.230705 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c80f836e439309bb35a76fd16bfdf56c5e1cf94d3da04180ffc50506a6d8cac1"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.230780 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.238000 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h29qn" event={"ID":"1926a2c9-1123-4ab5-b414-d91e0d56c8f2","Type":"ContainerStarted","Data":"8b5c3df7356dc67de506b9469536e47246c0841de14db0692458e10ef030d12c"}
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.238831 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" event={"ID":"72ebb725-29ae-4902-9b6b-6258039bb6c0","Type":"ContainerStarted","Data":"a803ead3efbcf01fa35d118c075edaf7fcdbf9fcfbe2bb5132c2dca34bd7b410"}
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.242469 4717 generic.go:334] "Generic (PLEG): container finished" podID="80125217-20f6-4337-8be2-8874b40aa10e" containerID="fc46b171a05c4b63466c71629b014ec86f17639e3337512422e70a105342cd64" exitCode=0
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.243286 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kmrb" event={"ID":"80125217-20f6-4337-8be2-8874b40aa10e","Type":"ContainerDied","Data":"fc46b171a05c4b63466c71629b014ec86f17639e3337512422e70a105342cd64"}
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.243318 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kmrb" event={"ID":"80125217-20f6-4337-8be2-8874b40aa10e","Type":"ContainerStarted","Data":"f5ba6839a4c9898a0a6a2be1ce783e056febf09b8c3bf751dee6441d4c844aea"}
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.244248 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.247762 4717 generic.go:334] "Generic (PLEG): container finished" podID="a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" containerID="12fe8bcf7a8c80ed6bfadc882a7fb3d8c908a71e34b11a891e8651aa01d5e3e4" exitCode=0
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.247848 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7bn5" event={"ID":"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb","Type":"ContainerDied","Data":"12fe8bcf7a8c80ed6bfadc882a7fb3d8c908a71e34b11a891e8651aa01d5e3e4"}
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.247897 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7bn5" event={"ID":"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb","Type":"ContainerStarted","Data":"083272ec1b004be045eb016dee7186ea2ea7fbd58df2af6b09d087414cedccee"}
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.251049 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99cnm" event={"ID":"73a32126-0013-46c3-8562-1a23d1d207b1","Type":"ContainerStarted","Data":"c48550bb1025db850b5ac184782300c0124da0272bec3ef38eb345299d7b7cfc"}
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.352567 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 21 21:48:56 crc kubenswrapper[4717]: E0221 21:48:56.352889 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="450eb855-2d6d-4503-9a7e-1980d8e97346" containerName="collect-profiles"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.352909 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="450eb855-2d6d-4503-9a7e-1980d8e97346" containerName="collect-profiles"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.353042 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="450eb855-2d6d-4503-9a7e-1980d8e97346" containerName="collect-profiles"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.353507 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.356474 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.358970 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.361583 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.441287 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7cface17-9c96-4fff-b28a-d41043906ab2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7cface17-9c96-4fff-b28a-d41043906ab2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.441366 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cface17-9c96-4fff-b28a-d41043906ab2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7cface17-9c96-4fff-b28a-d41043906ab2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.545394 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7cface17-9c96-4fff-b28a-d41043906ab2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7cface17-9c96-4fff-b28a-d41043906ab2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.545472 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cface17-9c96-4fff-b28a-d41043906ab2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7cface17-9c96-4fff-b28a-d41043906ab2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.545635 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7cface17-9c96-4fff-b28a-d41043906ab2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7cface17-9c96-4fff-b28a-d41043906ab2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.572236 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cface17-9c96-4fff-b28a-d41043906ab2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7cface17-9c96-4fff-b28a-d41043906ab2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.615073 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 21 21:48:56 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld
Feb 21 21:48:56 crc kubenswrapper[4717]: [+]process-running ok
Feb 21 21:48:56 crc kubenswrapper[4717]: healthz check failed
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.615186 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.684313 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6m7gj"]
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.685688 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6m7gj"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.688108 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.700991 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6m7gj"]
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.748623 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/914ea582-116d-4c4a-9d8d-34fda8fb5323-utilities\") pod \"redhat-marketplace-6m7gj\" (UID: \"914ea582-116d-4c4a-9d8d-34fda8fb5323\") " pod="openshift-marketplace/redhat-marketplace-6m7gj"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.748738 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/914ea582-116d-4c4a-9d8d-34fda8fb5323-catalog-content\") pod \"redhat-marketplace-6m7gj\" (UID: \"914ea582-116d-4c4a-9d8d-34fda8fb5323\") " pod="openshift-marketplace/redhat-marketplace-6m7gj"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.748806 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjwwh\" (UniqueName: \"kubernetes.io/projected/914ea582-116d-4c4a-9d8d-34fda8fb5323-kube-api-access-gjwwh\") pod \"redhat-marketplace-6m7gj\" (UID: \"914ea582-116d-4c4a-9d8d-34fda8fb5323\") " pod="openshift-marketplace/redhat-marketplace-6m7gj"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.830287 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.850210 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/914ea582-116d-4c4a-9d8d-34fda8fb5323-catalog-content\") pod \"redhat-marketplace-6m7gj\" (UID: \"914ea582-116d-4c4a-9d8d-34fda8fb5323\") " pod="openshift-marketplace/redhat-marketplace-6m7gj"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.850578 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjwwh\" (UniqueName: \"kubernetes.io/projected/914ea582-116d-4c4a-9d8d-34fda8fb5323-kube-api-access-gjwwh\") pod \"redhat-marketplace-6m7gj\" (UID: \"914ea582-116d-4c4a-9d8d-34fda8fb5323\") " pod="openshift-marketplace/redhat-marketplace-6m7gj"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.850638 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/914ea582-116d-4c4a-9d8d-34fda8fb5323-utilities\") pod \"redhat-marketplace-6m7gj\" (UID: \"914ea582-116d-4c4a-9d8d-34fda8fb5323\") " pod="openshift-marketplace/redhat-marketplace-6m7gj"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.851205 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/914ea582-116d-4c4a-9d8d-34fda8fb5323-utilities\") pod \"redhat-marketplace-6m7gj\" (UID: \"914ea582-116d-4c4a-9d8d-34fda8fb5323\") " pod="openshift-marketplace/redhat-marketplace-6m7gj"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.851493 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/914ea582-116d-4c4a-9d8d-34fda8fb5323-catalog-content\") pod \"redhat-marketplace-6m7gj\" (UID: \"914ea582-116d-4c4a-9d8d-34fda8fb5323\") " pod="openshift-marketplace/redhat-marketplace-6m7gj"
Feb 21 21:48:56 crc kubenswrapper[4717]: I0221 21:48:56.875083 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjwwh\" (UniqueName: \"kubernetes.io/projected/914ea582-116d-4c4a-9d8d-34fda8fb5323-kube-api-access-gjwwh\") pod \"redhat-marketplace-6m7gj\" (UID: \"914ea582-116d-4c4a-9d8d-34fda8fb5323\") " pod="openshift-marketplace/redhat-marketplace-6m7gj"
Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.001320 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6m7gj"
Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.084407 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q54qm"]
Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.090750 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q54qm"
Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.091453 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q54qm"]
Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.153907 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/280cdbf0-4686-47a5-a48e-55562853f0f6-catalog-content\") pod \"redhat-marketplace-q54qm\" (UID: \"280cdbf0-4686-47a5-a48e-55562853f0f6\") " pod="openshift-marketplace/redhat-marketplace-q54qm"
Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.153979 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6npc\" (UniqueName: \"kubernetes.io/projected/280cdbf0-4686-47a5-a48e-55562853f0f6-kube-api-access-q6npc\") pod \"redhat-marketplace-q54qm\" (UID: \"280cdbf0-4686-47a5-a48e-55562853f0f6\") " pod="openshift-marketplace/redhat-marketplace-q54qm"
Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.155396 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/280cdbf0-4686-47a5-a48e-55562853f0f6-utilities\") pod \"redhat-marketplace-q54qm\" (UID: \"280cdbf0-4686-47a5-a48e-55562853f0f6\") " pod="openshift-marketplace/redhat-marketplace-q54qm"
Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.256609 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/280cdbf0-4686-47a5-a48e-55562853f0f6-catalog-content\") pod \"redhat-marketplace-q54qm\" (UID: \"280cdbf0-4686-47a5-a48e-55562853f0f6\") " pod="openshift-marketplace/redhat-marketplace-q54qm"
Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.256674 4717 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-q6npc\" (UniqueName: \"kubernetes.io/projected/280cdbf0-4686-47a5-a48e-55562853f0f6-kube-api-access-q6npc\") pod \"redhat-marketplace-q54qm\" (UID: \"280cdbf0-4686-47a5-a48e-55562853f0f6\") " pod="openshift-marketplace/redhat-marketplace-q54qm" Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.256716 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/280cdbf0-4686-47a5-a48e-55562853f0f6-utilities\") pod \"redhat-marketplace-q54qm\" (UID: \"280cdbf0-4686-47a5-a48e-55562853f0f6\") " pod="openshift-marketplace/redhat-marketplace-q54qm" Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.257288 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/280cdbf0-4686-47a5-a48e-55562853f0f6-utilities\") pod \"redhat-marketplace-q54qm\" (UID: \"280cdbf0-4686-47a5-a48e-55562853f0f6\") " pod="openshift-marketplace/redhat-marketplace-q54qm" Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.257427 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/280cdbf0-4686-47a5-a48e-55562853f0f6-catalog-content\") pod \"redhat-marketplace-q54qm\" (UID: \"280cdbf0-4686-47a5-a48e-55562853f0f6\") " pod="openshift-marketplace/redhat-marketplace-q54qm" Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.266385 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6m7gj"] Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.269212 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" event={"ID":"72ebb725-29ae-4902-9b6b-6258039bb6c0","Type":"ContainerStarted","Data":"38d3ac1f0a3cd7335d2fc7b1e79f0b101179bcc8a2bfb416f9928edccbac5621"} Feb 21 21:48:57 crc 
kubenswrapper[4717]: I0221 21:48:57.270401 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.274350 4717 generic.go:334] "Generic (PLEG): container finished" podID="1926a2c9-1123-4ab5-b414-d91e0d56c8f2" containerID="6c4a1e644595310b09ae1524ab95f1e42764bbcc3cac38f176e465e10fda7a1a" exitCode=0 Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.274409 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h29qn" event={"ID":"1926a2c9-1123-4ab5-b414-d91e0d56c8f2","Type":"ContainerDied","Data":"6c4a1e644595310b09ae1524ab95f1e42764bbcc3cac38f176e465e10fda7a1a"} Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.281773 4717 generic.go:334] "Generic (PLEG): container finished" podID="73a32126-0013-46c3-8562-1a23d1d207b1" containerID="00c2a1ae8558363f61608b5fff166b1a4452394fd57a5c53c3406daf18f83025" exitCode=0 Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.283083 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99cnm" event={"ID":"73a32126-0013-46c3-8562-1a23d1d207b1","Type":"ContainerDied","Data":"00c2a1ae8558363f61608b5fff166b1a4452394fd57a5c53c3406daf18f83025"} Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.284826 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 21 21:48:57 crc kubenswrapper[4717]: W0221 21:48:57.288964 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod914ea582_116d_4c4a_9d8d_34fda8fb5323.slice/crio-752aae47e11cc703baa30e6fc72dfa8a5e246bfca22d8b789f1c5e86fdf65887 WatchSource:0}: Error finding container 752aae47e11cc703baa30e6fc72dfa8a5e246bfca22d8b789f1c5e86fdf65887: Status 404 returned error can't find the container with id 
752aae47e11cc703baa30e6fc72dfa8a5e246bfca22d8b789f1c5e86fdf65887 Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.290035 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6npc\" (UniqueName: \"kubernetes.io/projected/280cdbf0-4686-47a5-a48e-55562853f0f6-kube-api-access-q6npc\") pod \"redhat-marketplace-q54qm\" (UID: \"280cdbf0-4686-47a5-a48e-55562853f0f6\") " pod="openshift-marketplace/redhat-marketplace-q54qm" Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.291796 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" podStartSLOduration=130.29178192 podStartE2EDuration="2m10.29178192s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:57.290550092 +0000 UTC m=+152.072083714" watchObservedRunningTime="2026-02-21 21:48:57.29178192 +0000 UTC m=+152.073315542" Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.413212 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q54qm" Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.605508 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-8cf52 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.605567 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-8cf52 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.605998 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8cf52" podUID="3f6219e3-5e33-4c13-9c52-de506ba0b6a6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.606097 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8cf52" podUID="3f6219e3-5e33-4c13-9c52-de506ba0b6a6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.609744 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.615773 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 21:48:57 crc 
kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 21 21:48:57 crc kubenswrapper[4717]: [+]process-running ok Feb 21 21:48:57 crc kubenswrapper[4717]: healthz check failed Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.615821 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.838997 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.840736 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.843609 4717 patch_prober.go:28] interesting pod/console-f9d7485db-47lf7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.843700 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-47lf7" podUID="17bb07e6-67dd-4cc5-b979-9ef794228e81" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" Feb 21 21:48:57 crc kubenswrapper[4717]: I0221 21:48:57.869264 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q54qm"] Feb 21 21:48:57 crc kubenswrapper[4717]: W0221 21:48:57.895414 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod280cdbf0_4686_47a5_a48e_55562853f0f6.slice/crio-7e8b7594d596c20431c54865ceb53696fab76ccca82852ceac81ec677e8f3952 WatchSource:0}: Error finding container 7e8b7594d596c20431c54865ceb53696fab76ccca82852ceac81ec677e8f3952: Status 404 returned error can't find the container with id 7e8b7594d596c20431c54865ceb53696fab76ccca82852ceac81ec677e8f3952 Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.081524 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v7tr6"] Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.082518 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v7tr6" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.086815 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.090962 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.099887 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v7tr6"] Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.102781 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.134297 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.181617 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0c5c67-a77c-463b-8339-a73a7a9605e1-catalog-content\") pod 
\"redhat-operators-v7tr6\" (UID: \"ea0c5c67-a77c-463b-8339-a73a7a9605e1\") " pod="openshift-marketplace/redhat-operators-v7tr6" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.181765 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxvwr\" (UniqueName: \"kubernetes.io/projected/ea0c5c67-a77c-463b-8339-a73a7a9605e1-kube-api-access-lxvwr\") pod \"redhat-operators-v7tr6\" (UID: \"ea0c5c67-a77c-463b-8339-a73a7a9605e1\") " pod="openshift-marketplace/redhat-operators-v7tr6" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.181799 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0c5c67-a77c-463b-8339-a73a7a9605e1-utilities\") pod \"redhat-operators-v7tr6\" (UID: \"ea0c5c67-a77c-463b-8339-a73a7a9605e1\") " pod="openshift-marketplace/redhat-operators-v7tr6" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.234849 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.235671 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.253542 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.288670 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxvwr\" (UniqueName: \"kubernetes.io/projected/ea0c5c67-a77c-463b-8339-a73a7a9605e1-kube-api-access-lxvwr\") pod \"redhat-operators-v7tr6\" (UID: \"ea0c5c67-a77c-463b-8339-a73a7a9605e1\") " pod="openshift-marketplace/redhat-operators-v7tr6" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.288778 
4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0c5c67-a77c-463b-8339-a73a7a9605e1-utilities\") pod \"redhat-operators-v7tr6\" (UID: \"ea0c5c67-a77c-463b-8339-a73a7a9605e1\") " pod="openshift-marketplace/redhat-operators-v7tr6" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.288906 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0c5c67-a77c-463b-8339-a73a7a9605e1-catalog-content\") pod \"redhat-operators-v7tr6\" (UID: \"ea0c5c67-a77c-463b-8339-a73a7a9605e1\") " pod="openshift-marketplace/redhat-operators-v7tr6" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.290782 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0c5c67-a77c-463b-8339-a73a7a9605e1-catalog-content\") pod \"redhat-operators-v7tr6\" (UID: \"ea0c5c67-a77c-463b-8339-a73a7a9605e1\") " pod="openshift-marketplace/redhat-operators-v7tr6" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.290793 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0c5c67-a77c-463b-8339-a73a7a9605e1-utilities\") pod \"redhat-operators-v7tr6\" (UID: \"ea0c5c67-a77c-463b-8339-a73a7a9605e1\") " pod="openshift-marketplace/redhat-operators-v7tr6" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.337584 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxvwr\" (UniqueName: \"kubernetes.io/projected/ea0c5c67-a77c-463b-8339-a73a7a9605e1-kube-api-access-lxvwr\") pod \"redhat-operators-v7tr6\" (UID: \"ea0c5c67-a77c-463b-8339-a73a7a9605e1\") " pod="openshift-marketplace/redhat-operators-v7tr6" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.342144 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="280cdbf0-4686-47a5-a48e-55562853f0f6" containerID="7bad77e341c5927ff1c2ac1170cc55f40c8075fce6dfb311be9e5a8feee4d068" exitCode=0 Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.342228 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q54qm" event={"ID":"280cdbf0-4686-47a5-a48e-55562853f0f6","Type":"ContainerDied","Data":"7bad77e341c5927ff1c2ac1170cc55f40c8075fce6dfb311be9e5a8feee4d068"} Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.342266 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q54qm" event={"ID":"280cdbf0-4686-47a5-a48e-55562853f0f6","Type":"ContainerStarted","Data":"7e8b7594d596c20431c54865ceb53696fab76ccca82852ceac81ec677e8f3952"} Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.347420 4717 generic.go:334] "Generic (PLEG): container finished" podID="914ea582-116d-4c4a-9d8d-34fda8fb5323" containerID="44fa110c2279e50108c55df2b05c36c82825d4b791a6c36ea3503d948ac3ddbf" exitCode=0 Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.347656 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6m7gj" event={"ID":"914ea582-116d-4c4a-9d8d-34fda8fb5323","Type":"ContainerDied","Data":"44fa110c2279e50108c55df2b05c36c82825d4b791a6c36ea3503d948ac3ddbf"} Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.347683 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6m7gj" event={"ID":"914ea582-116d-4c4a-9d8d-34fda8fb5323","Type":"ContainerStarted","Data":"752aae47e11cc703baa30e6fc72dfa8a5e246bfca22d8b789f1c5e86fdf65887"} Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.353111 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"7cface17-9c96-4fff-b28a-d41043906ab2","Type":"ContainerStarted","Data":"3b77b741f89b5960d8336a04aa7317bef2904d7da02710355b6fe678ba198d51"} Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.353205 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7cface17-9c96-4fff-b28a-d41043906ab2","Type":"ContainerStarted","Data":"0a69b7f975c7a3e210fd08ba8574e29e6e2cfaaac6e73b5c173f0ddff76d5682"} Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.360774 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-c5sr8" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.362726 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cvrbm" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.438318 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.438297523 podStartE2EDuration="2.438297523s" podCreationTimestamp="2026-02-21 21:48:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:48:58.43813708 +0000 UTC m=+153.219670702" watchObservedRunningTime="2026-02-21 21:48:58.438297523 +0000 UTC m=+153.219831145" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.466612 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v7tr6" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.496261 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rrfqw"] Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.498002 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rrfqw" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.544080 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rrfqw"] Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.603062 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f963242-7453-42b2-83a3-6425582a2b9c-catalog-content\") pod \"redhat-operators-rrfqw\" (UID: \"7f963242-7453-42b2-83a3-6425582a2b9c\") " pod="openshift-marketplace/redhat-operators-rrfqw" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.603165 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f963242-7453-42b2-83a3-6425582a2b9c-utilities\") pod \"redhat-operators-rrfqw\" (UID: \"7f963242-7453-42b2-83a3-6425582a2b9c\") " pod="openshift-marketplace/redhat-operators-rrfqw" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.603193 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdmml\" (UniqueName: \"kubernetes.io/projected/7f963242-7453-42b2-83a3-6425582a2b9c-kube-api-access-rdmml\") pod \"redhat-operators-rrfqw\" (UID: \"7f963242-7453-42b2-83a3-6425582a2b9c\") " pod="openshift-marketplace/redhat-operators-rrfqw" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.617510 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 21:48:58 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 21 21:48:58 crc kubenswrapper[4717]: [+]process-running ok Feb 21 21:48:58 crc kubenswrapper[4717]: healthz check failed Feb 21 
21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.617569 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.704572 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f963242-7453-42b2-83a3-6425582a2b9c-utilities\") pod \"redhat-operators-rrfqw\" (UID: \"7f963242-7453-42b2-83a3-6425582a2b9c\") " pod="openshift-marketplace/redhat-operators-rrfqw" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.704639 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdmml\" (UniqueName: \"kubernetes.io/projected/7f963242-7453-42b2-83a3-6425582a2b9c-kube-api-access-rdmml\") pod \"redhat-operators-rrfqw\" (UID: \"7f963242-7453-42b2-83a3-6425582a2b9c\") " pod="openshift-marketplace/redhat-operators-rrfqw" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.704691 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f963242-7453-42b2-83a3-6425582a2b9c-catalog-content\") pod \"redhat-operators-rrfqw\" (UID: \"7f963242-7453-42b2-83a3-6425582a2b9c\") " pod="openshift-marketplace/redhat-operators-rrfqw" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.710518 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f963242-7453-42b2-83a3-6425582a2b9c-catalog-content\") pod \"redhat-operators-rrfqw\" (UID: \"7f963242-7453-42b2-83a3-6425582a2b9c\") " pod="openshift-marketplace/redhat-operators-rrfqw" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.713091 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f963242-7453-42b2-83a3-6425582a2b9c-utilities\") pod \"redhat-operators-rrfqw\" (UID: \"7f963242-7453-42b2-83a3-6425582a2b9c\") " pod="openshift-marketplace/redhat-operators-rrfqw" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.746841 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdmml\" (UniqueName: \"kubernetes.io/projected/7f963242-7453-42b2-83a3-6425582a2b9c-kube-api-access-rdmml\") pod \"redhat-operators-rrfqw\" (UID: \"7f963242-7453-42b2-83a3-6425582a2b9c\") " pod="openshift-marketplace/redhat-operators-rrfqw" Feb 21 21:48:58 crc kubenswrapper[4717]: I0221 21:48:58.912017 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rrfqw" Feb 21 21:48:59 crc kubenswrapper[4717]: I0221 21:48:59.087476 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v7tr6"] Feb 21 21:48:59 crc kubenswrapper[4717]: W0221 21:48:59.106126 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea0c5c67_a77c_463b_8339_a73a7a9605e1.slice/crio-04add27162731e928224a60d16d864824f02733524042a567bb0ab9a56cbdcec WatchSource:0}: Error finding container 04add27162731e928224a60d16d864824f02733524042a567bb0ab9a56cbdcec: Status 404 returned error can't find the container with id 04add27162731e928224a60d16d864824f02733524042a567bb0ab9a56cbdcec Feb 21 21:48:59 crc kubenswrapper[4717]: I0221 21:48:59.195695 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rrfqw"] Feb 21 21:48:59 crc kubenswrapper[4717]: I0221 21:48:59.370172 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrfqw" 
event={"ID":"7f963242-7453-42b2-83a3-6425582a2b9c","Type":"ContainerStarted","Data":"164718ccc9353ee8cf934d19ff954f43029ba5409da67c703eb6d5405b0f7fb8"} Feb 21 21:48:59 crc kubenswrapper[4717]: I0221 21:48:59.376275 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cface17-9c96-4fff-b28a-d41043906ab2" containerID="3b77b741f89b5960d8336a04aa7317bef2904d7da02710355b6fe678ba198d51" exitCode=0 Feb 21 21:48:59 crc kubenswrapper[4717]: I0221 21:48:59.376332 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7cface17-9c96-4fff-b28a-d41043906ab2","Type":"ContainerDied","Data":"3b77b741f89b5960d8336a04aa7317bef2904d7da02710355b6fe678ba198d51"} Feb 21 21:48:59 crc kubenswrapper[4717]: I0221 21:48:59.382054 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7tr6" event={"ID":"ea0c5c67-a77c-463b-8339-a73a7a9605e1","Type":"ContainerStarted","Data":"04add27162731e928224a60d16d864824f02733524042a567bb0ab9a56cbdcec"} Feb 21 21:48:59 crc kubenswrapper[4717]: I0221 21:48:59.613513 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 21:48:59 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 21 21:48:59 crc kubenswrapper[4717]: [+]process-running ok Feb 21 21:48:59 crc kubenswrapper[4717]: healthz check failed Feb 21 21:48:59 crc kubenswrapper[4717]: I0221 21:48:59.613594 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 21:49:00 crc kubenswrapper[4717]: I0221 21:49:00.322582 4717 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 21 21:49:00 crc kubenswrapper[4717]: I0221 21:49:00.323386 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 21:49:00 crc kubenswrapper[4717]: I0221 21:49:00.328130 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 21 21:49:00 crc kubenswrapper[4717]: I0221 21:49:00.328137 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 21 21:49:00 crc kubenswrapper[4717]: I0221 21:49:00.344033 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 21 21:49:00 crc kubenswrapper[4717]: I0221 21:49:00.432795 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45be6352-187d-42b5-8465-4ef03f908a9a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"45be6352-187d-42b5-8465-4ef03f908a9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 21:49:00 crc kubenswrapper[4717]: I0221 21:49:00.432897 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45be6352-187d-42b5-8465-4ef03f908a9a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"45be6352-187d-42b5-8465-4ef03f908a9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 21:49:00 crc kubenswrapper[4717]: I0221 21:49:00.452089 4717 generic.go:334] "Generic (PLEG): container finished" podID="7f963242-7453-42b2-83a3-6425582a2b9c" containerID="517ece5c1cdcd2d6d7c58b0e4c32d3335e9f3cdacfe1b7110fabf75ae26be178" exitCode=0 Feb 21 21:49:00 crc kubenswrapper[4717]: I0221 21:49:00.452163 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-rrfqw" event={"ID":"7f963242-7453-42b2-83a3-6425582a2b9c","Type":"ContainerDied","Data":"517ece5c1cdcd2d6d7c58b0e4c32d3335e9f3cdacfe1b7110fabf75ae26be178"} Feb 21 21:49:00 crc kubenswrapper[4717]: I0221 21:49:00.458746 4717 generic.go:334] "Generic (PLEG): container finished" podID="ea0c5c67-a77c-463b-8339-a73a7a9605e1" containerID="dce0b5f83153604bef5832ccb022565eb7517b24ebba54630973c986e5725123" exitCode=0 Feb 21 21:49:00 crc kubenswrapper[4717]: I0221 21:49:00.458847 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7tr6" event={"ID":"ea0c5c67-a77c-463b-8339-a73a7a9605e1","Type":"ContainerDied","Data":"dce0b5f83153604bef5832ccb022565eb7517b24ebba54630973c986e5725123"} Feb 21 21:49:01 crc kubenswrapper[4717]: I0221 21:49:01.696278 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45be6352-187d-42b5-8465-4ef03f908a9a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"45be6352-187d-42b5-8465-4ef03f908a9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 21:49:01 crc kubenswrapper[4717]: I0221 21:49:01.696436 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45be6352-187d-42b5-8465-4ef03f908a9a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"45be6352-187d-42b5-8465-4ef03f908a9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 21:49:01 crc kubenswrapper[4717]: I0221 21:49:01.705816 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45be6352-187d-42b5-8465-4ef03f908a9a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"45be6352-187d-42b5-8465-4ef03f908a9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 21:49:01 crc kubenswrapper[4717]: I0221 21:49:01.731678 4717 
patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 21:49:01 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 21 21:49:01 crc kubenswrapper[4717]: [+]process-running ok Feb 21 21:49:01 crc kubenswrapper[4717]: healthz check failed Feb 21 21:49:01 crc kubenswrapper[4717]: I0221 21:49:01.732113 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 21:49:01 crc kubenswrapper[4717]: I0221 21:49:01.782091 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45be6352-187d-42b5-8465-4ef03f908a9a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"45be6352-187d-42b5-8465-4ef03f908a9a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 21:49:01 crc kubenswrapper[4717]: I0221 21:49:01.844160 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 21:49:02 crc kubenswrapper[4717]: I0221 21:49:02.207689 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 21 21:49:02 crc kubenswrapper[4717]: I0221 21:49:02.233441 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cface17-9c96-4fff-b28a-d41043906ab2-kube-api-access\") pod \"7cface17-9c96-4fff-b28a-d41043906ab2\" (UID: \"7cface17-9c96-4fff-b28a-d41043906ab2\") " Feb 21 21:49:02 crc kubenswrapper[4717]: I0221 21:49:02.233586 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7cface17-9c96-4fff-b28a-d41043906ab2-kubelet-dir\") pod \"7cface17-9c96-4fff-b28a-d41043906ab2\" (UID: \"7cface17-9c96-4fff-b28a-d41043906ab2\") " Feb 21 21:49:02 crc kubenswrapper[4717]: I0221 21:49:02.233956 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cface17-9c96-4fff-b28a-d41043906ab2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7cface17-9c96-4fff-b28a-d41043906ab2" (UID: "7cface17-9c96-4fff-b28a-d41043906ab2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:49:02 crc kubenswrapper[4717]: I0221 21:49:02.244454 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cface17-9c96-4fff-b28a-d41043906ab2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7cface17-9c96-4fff-b28a-d41043906ab2" (UID: "7cface17-9c96-4fff-b28a-d41043906ab2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:49:02 crc kubenswrapper[4717]: I0221 21:49:02.335425 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cface17-9c96-4fff-b28a-d41043906ab2-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 21 21:49:02 crc kubenswrapper[4717]: I0221 21:49:02.335468 4717 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7cface17-9c96-4fff-b28a-d41043906ab2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 21 21:49:02 crc kubenswrapper[4717]: I0221 21:49:02.480369 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 21 21:49:02 crc kubenswrapper[4717]: I0221 21:49:02.616227 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 21:49:02 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 21 21:49:02 crc kubenswrapper[4717]: [+]process-running ok Feb 21 21:49:02 crc kubenswrapper[4717]: healthz check failed Feb 21 21:49:02 crc kubenswrapper[4717]: I0221 21:49:02.616332 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 21:49:02 crc kubenswrapper[4717]: I0221 21:49:02.802170 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"45be6352-187d-42b5-8465-4ef03f908a9a","Type":"ContainerStarted","Data":"75f9857a6af8eb05277d4be9c3af47859c9b5787761582e8d65e7d319d67c168"} Feb 21 21:49:02 crc kubenswrapper[4717]: I0221 21:49:02.804188 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7cface17-9c96-4fff-b28a-d41043906ab2","Type":"ContainerDied","Data":"0a69b7f975c7a3e210fd08ba8574e29e6e2cfaaac6e73b5c173f0ddff76d5682"} Feb 21 21:49:02 crc kubenswrapper[4717]: I0221 21:49:02.804236 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a69b7f975c7a3e210fd08ba8574e29e6e2cfaaac6e73b5c173f0ddff76d5682" Feb 21 21:49:02 crc kubenswrapper[4717]: I0221 21:49:02.804320 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 21 21:49:03 crc kubenswrapper[4717]: I0221 21:49:03.249789 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dh4pw" Feb 21 21:49:03 crc kubenswrapper[4717]: I0221 21:49:03.614926 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 21:49:03 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 21 21:49:03 crc kubenswrapper[4717]: [+]process-running ok Feb 21 21:49:03 crc kubenswrapper[4717]: healthz check failed Feb 21 21:49:03 crc kubenswrapper[4717]: I0221 21:49:03.615257 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 21:49:03 crc kubenswrapper[4717]: I0221 21:49:03.825733 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"45be6352-187d-42b5-8465-4ef03f908a9a","Type":"ContainerStarted","Data":"f7c1be9a839be9a6e7e559a20bde5820708bd1a701eb3fb274c3269f31ccecd4"} 
Feb 21 21:49:04 crc kubenswrapper[4717]: I0221 21:49:04.615146 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 21:49:04 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Feb 21 21:49:04 crc kubenswrapper[4717]: [+]process-running ok Feb 21 21:49:04 crc kubenswrapper[4717]: healthz check failed Feb 21 21:49:04 crc kubenswrapper[4717]: I0221 21:49:04.615288 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 21:49:04 crc kubenswrapper[4717]: I0221 21:49:04.875196 4717 generic.go:334] "Generic (PLEG): container finished" podID="45be6352-187d-42b5-8465-4ef03f908a9a" containerID="f7c1be9a839be9a6e7e559a20bde5820708bd1a701eb3fb274c3269f31ccecd4" exitCode=0 Feb 21 21:49:04 crc kubenswrapper[4717]: I0221 21:49:04.875415 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"45be6352-187d-42b5-8465-4ef03f908a9a","Type":"ContainerDied","Data":"f7c1be9a839be9a6e7e559a20bde5820708bd1a701eb3fb274c3269f31ccecd4"} Feb 21 21:49:05 crc kubenswrapper[4717]: I0221 21:49:05.619772 4717 patch_prober.go:28] interesting pod/router-default-5444994796-g8wzz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 21 21:49:05 crc kubenswrapper[4717]: [+]has-synced ok Feb 21 21:49:05 crc kubenswrapper[4717]: [+]process-running ok Feb 21 21:49:05 crc kubenswrapper[4717]: healthz check failed Feb 21 21:49:05 crc kubenswrapper[4717]: I0221 21:49:05.619840 4717 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-g8wzz" podUID="dbd91a09-921f-4585-986a-90fd4a111781" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 21 21:49:06 crc kubenswrapper[4717]: I0221 21:49:06.616908 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 21 21:49:06 crc kubenswrapper[4717]: I0221 21:49:06.621491 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-g8wzz" Feb 21 21:49:07 crc kubenswrapper[4717]: I0221 21:49:07.611911 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-8cf52" Feb 21 21:49:07 crc kubenswrapper[4717]: I0221 21:49:07.843609 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:49:07 crc kubenswrapper[4717]: I0221 21:49:07.846988 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:49:09 crc kubenswrapper[4717]: I0221 21:49:09.062292 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 21:49:09 crc kubenswrapper[4717]: I0221 21:49:09.062662 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 21:49:10 crc kubenswrapper[4717]: I0221 21:49:10.472915 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs\") pod \"network-metrics-daemon-gt2bg\" (UID: \"8203b79d-1367-43b6-8567-797ec1b0c09b\") " pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:49:10 crc kubenswrapper[4717]: I0221 21:49:10.482328 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8203b79d-1367-43b6-8567-797ec1b0c09b-metrics-certs\") pod \"network-metrics-daemon-gt2bg\" (UID: \"8203b79d-1367-43b6-8567-797ec1b0c09b\") " pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:49:10 crc kubenswrapper[4717]: I0221 21:49:10.631668 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gt2bg" Feb 21 21:49:13 crc kubenswrapper[4717]: I0221 21:49:13.523760 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 21:49:13 crc kubenswrapper[4717]: I0221 21:49:13.621518 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45be6352-187d-42b5-8465-4ef03f908a9a-kubelet-dir\") pod \"45be6352-187d-42b5-8465-4ef03f908a9a\" (UID: \"45be6352-187d-42b5-8465-4ef03f908a9a\") " Feb 21 21:49:13 crc kubenswrapper[4717]: I0221 21:49:13.621689 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45be6352-187d-42b5-8465-4ef03f908a9a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "45be6352-187d-42b5-8465-4ef03f908a9a" (UID: "45be6352-187d-42b5-8465-4ef03f908a9a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:49:13 crc kubenswrapper[4717]: I0221 21:49:13.621720 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45be6352-187d-42b5-8465-4ef03f908a9a-kube-api-access\") pod \"45be6352-187d-42b5-8465-4ef03f908a9a\" (UID: \"45be6352-187d-42b5-8465-4ef03f908a9a\") " Feb 21 21:49:13 crc kubenswrapper[4717]: I0221 21:49:13.622465 4717 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/45be6352-187d-42b5-8465-4ef03f908a9a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 21 21:49:13 crc kubenswrapper[4717]: I0221 21:49:13.626933 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45be6352-187d-42b5-8465-4ef03f908a9a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "45be6352-187d-42b5-8465-4ef03f908a9a" (UID: "45be6352-187d-42b5-8465-4ef03f908a9a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:49:13 crc kubenswrapper[4717]: I0221 21:49:13.723910 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45be6352-187d-42b5-8465-4ef03f908a9a-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 21 21:49:13 crc kubenswrapper[4717]: I0221 21:49:13.964750 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"45be6352-187d-42b5-8465-4ef03f908a9a","Type":"ContainerDied","Data":"75f9857a6af8eb05277d4be9c3af47859c9b5787761582e8d65e7d319d67c168"} Feb 21 21:49:13 crc kubenswrapper[4717]: I0221 21:49:13.964817 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75f9857a6af8eb05277d4be9c3af47859c9b5787761582e8d65e7d319d67c168" Feb 21 21:49:13 crc kubenswrapper[4717]: I0221 21:49:13.964891 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 21 21:49:15 crc kubenswrapper[4717]: I0221 21:49:15.275809 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:49:24 crc kubenswrapper[4717]: I0221 21:49:24.133291 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 21 21:49:26 crc kubenswrapper[4717]: I0221 21:49:26.071652 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-76zmn"] Feb 21 21:49:27 crc kubenswrapper[4717]: E0221 21:49:27.609397 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 21 21:49:27 crc kubenswrapper[4717]: E0221 
21:49:27.612493 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6npc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-q54qm_openshift-marketplace(280cdbf0-4686-47a5-a48e-55562853f0f6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 21 21:49:27 crc kubenswrapper[4717]: E0221 21:49:27.614251 4717 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-q54qm" podUID="280cdbf0-4686-47a5-a48e-55562853f0f6" Feb 21 21:49:28 crc kubenswrapper[4717]: I0221 21:49:28.020454 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gt2bg"] Feb 21 21:49:28 crc kubenswrapper[4717]: I0221 21:49:28.057496 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h29qn" event={"ID":"1926a2c9-1123-4ab5-b414-d91e0d56c8f2","Type":"ContainerStarted","Data":"1ccbf0b34f9abc9ea3c0b42caef41b580730e87608a851b7b3eacdffaef08f76"} Feb 21 21:49:28 crc kubenswrapper[4717]: I0221 21:49:28.062962 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kmrb" event={"ID":"80125217-20f6-4337-8be2-8874b40aa10e","Type":"ContainerStarted","Data":"52d802740022168e1a06e03f669be100ced4e5b8b1c7c439dc6f0b5721857bd6"} Feb 21 21:49:28 crc kubenswrapper[4717]: W0221 21:49:28.064950 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8203b79d_1367_43b6_8567_797ec1b0c09b.slice/crio-eadbec3bcb1ece64885ecbcfe2b0c13f0fb4f64b9973b67d3f831d5f73f04f7d WatchSource:0}: Error finding container eadbec3bcb1ece64885ecbcfe2b0c13f0fb4f64b9973b67d3f831d5f73f04f7d: Status 404 returned error can't find the container with id eadbec3bcb1ece64885ecbcfe2b0c13f0fb4f64b9973b67d3f831d5f73f04f7d Feb 21 21:49:28 crc kubenswrapper[4717]: I0221 21:49:28.065668 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7bn5" event={"ID":"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb","Type":"ContainerStarted","Data":"22c24f097aab5059c6b45b8ea818853073ab35f87b414f27305926ddeba387d2"} Feb 21 21:49:28 crc 
kubenswrapper[4717]: I0221 21:49:28.071026 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7tr6" event={"ID":"ea0c5c67-a77c-463b-8339-a73a7a9605e1","Type":"ContainerStarted","Data":"350483db840ae6036239485e8c1c74510281418feb88611e0e06f35f439ea10d"} Feb 21 21:49:28 crc kubenswrapper[4717]: I0221 21:49:28.072682 4717 generic.go:334] "Generic (PLEG): container finished" podID="914ea582-116d-4c4a-9d8d-34fda8fb5323" containerID="9af7938113f33996040e3d4a5142246e38c533b2998e3d9b073b15ab2ed376fc" exitCode=0 Feb 21 21:49:28 crc kubenswrapper[4717]: I0221 21:49:28.072810 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6m7gj" event={"ID":"914ea582-116d-4c4a-9d8d-34fda8fb5323","Type":"ContainerDied","Data":"9af7938113f33996040e3d4a5142246e38c533b2998e3d9b073b15ab2ed376fc"} Feb 21 21:49:28 crc kubenswrapper[4717]: I0221 21:49:28.074546 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrfqw" event={"ID":"7f963242-7453-42b2-83a3-6425582a2b9c","Type":"ContainerStarted","Data":"91d0adb9227dfb56d3a9894aa985430efaf54685e8177db6c77e8df4f4c299c1"} Feb 21 21:49:28 crc kubenswrapper[4717]: I0221 21:49:28.079030 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99cnm" event={"ID":"73a32126-0013-46c3-8562-1a23d1d207b1","Type":"ContainerStarted","Data":"663d5985cf627bdc61f6264f1268b80b99a276363c895fb9edb69313a2e0b0b4"} Feb 21 21:49:28 crc kubenswrapper[4717]: E0221 21:49:28.106349 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-q54qm" podUID="280cdbf0-4686-47a5-a48e-55562853f0f6" Feb 21 21:49:28 crc kubenswrapper[4717]: I0221 21:49:28.131545 4717 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p5kv" Feb 21 21:49:29 crc kubenswrapper[4717]: I0221 21:49:29.088220 4717 generic.go:334] "Generic (PLEG): container finished" podID="73a32126-0013-46c3-8562-1a23d1d207b1" containerID="663d5985cf627bdc61f6264f1268b80b99a276363c895fb9edb69313a2e0b0b4" exitCode=0 Feb 21 21:49:29 crc kubenswrapper[4717]: I0221 21:49:29.088415 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99cnm" event={"ID":"73a32126-0013-46c3-8562-1a23d1d207b1","Type":"ContainerDied","Data":"663d5985cf627bdc61f6264f1268b80b99a276363c895fb9edb69313a2e0b0b4"} Feb 21 21:49:29 crc kubenswrapper[4717]: I0221 21:49:29.104905 4717 generic.go:334] "Generic (PLEG): container finished" podID="80125217-20f6-4337-8be2-8874b40aa10e" containerID="52d802740022168e1a06e03f669be100ced4e5b8b1c7c439dc6f0b5721857bd6" exitCode=0 Feb 21 21:49:29 crc kubenswrapper[4717]: I0221 21:49:29.105091 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kmrb" event={"ID":"80125217-20f6-4337-8be2-8874b40aa10e","Type":"ContainerDied","Data":"52d802740022168e1a06e03f669be100ced4e5b8b1c7c439dc6f0b5721857bd6"} Feb 21 21:49:29 crc kubenswrapper[4717]: I0221 21:49:29.120034 4717 generic.go:334] "Generic (PLEG): container finished" podID="1926a2c9-1123-4ab5-b414-d91e0d56c8f2" containerID="1ccbf0b34f9abc9ea3c0b42caef41b580730e87608a851b7b3eacdffaef08f76" exitCode=0 Feb 21 21:49:29 crc kubenswrapper[4717]: I0221 21:49:29.120106 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h29qn" event={"ID":"1926a2c9-1123-4ab5-b414-d91e0d56c8f2","Type":"ContainerDied","Data":"1ccbf0b34f9abc9ea3c0b42caef41b580730e87608a851b7b3eacdffaef08f76"} Feb 21 21:49:29 crc kubenswrapper[4717]: I0221 21:49:29.123471 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" containerID="22c24f097aab5059c6b45b8ea818853073ab35f87b414f27305926ddeba387d2" exitCode=0 Feb 21 21:49:29 crc kubenswrapper[4717]: I0221 21:49:29.123694 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7bn5" event={"ID":"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb","Type":"ContainerDied","Data":"22c24f097aab5059c6b45b8ea818853073ab35f87b414f27305926ddeba387d2"} Feb 21 21:49:29 crc kubenswrapper[4717]: I0221 21:49:29.127824 4717 generic.go:334] "Generic (PLEG): container finished" podID="ea0c5c67-a77c-463b-8339-a73a7a9605e1" containerID="350483db840ae6036239485e8c1c74510281418feb88611e0e06f35f439ea10d" exitCode=0 Feb 21 21:49:29 crc kubenswrapper[4717]: I0221 21:49:29.127936 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7tr6" event={"ID":"ea0c5c67-a77c-463b-8339-a73a7a9605e1","Type":"ContainerDied","Data":"350483db840ae6036239485e8c1c74510281418feb88611e0e06f35f439ea10d"} Feb 21 21:49:29 crc kubenswrapper[4717]: I0221 21:49:29.149319 4717 generic.go:334] "Generic (PLEG): container finished" podID="7f963242-7453-42b2-83a3-6425582a2b9c" containerID="91d0adb9227dfb56d3a9894aa985430efaf54685e8177db6c77e8df4f4c299c1" exitCode=0 Feb 21 21:49:29 crc kubenswrapper[4717]: I0221 21:49:29.150263 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrfqw" event={"ID":"7f963242-7453-42b2-83a3-6425582a2b9c","Type":"ContainerDied","Data":"91d0adb9227dfb56d3a9894aa985430efaf54685e8177db6c77e8df4f4c299c1"} Feb 21 21:49:29 crc kubenswrapper[4717]: I0221 21:49:29.156318 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" event={"ID":"8203b79d-1367-43b6-8567-797ec1b0c09b","Type":"ContainerStarted","Data":"a9570028177b1489335bc14445312881c7d760cd501735732d6b17864952f665"} Feb 21 21:49:29 crc kubenswrapper[4717]: I0221 21:49:29.156397 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" event={"ID":"8203b79d-1367-43b6-8567-797ec1b0c09b","Type":"ContainerStarted","Data":"eadbec3bcb1ece64885ecbcfe2b0c13f0fb4f64b9973b67d3f831d5f73f04f7d"} Feb 21 21:49:30 crc kubenswrapper[4717]: I0221 21:49:30.166609 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6m7gj" event={"ID":"914ea582-116d-4c4a-9d8d-34fda8fb5323","Type":"ContainerStarted","Data":"422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637"} Feb 21 21:49:30 crc kubenswrapper[4717]: I0221 21:49:30.168406 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gt2bg" event={"ID":"8203b79d-1367-43b6-8567-797ec1b0c09b","Type":"ContainerStarted","Data":"65c22c3ed98f98c5d227c04a0aa45c747dde668f4d66ed6183a859529714d8c6"} Feb 21 21:49:30 crc kubenswrapper[4717]: I0221 21:49:30.193445 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gt2bg" podStartSLOduration=163.193413612 podStartE2EDuration="2m43.193413612s" podCreationTimestamp="2026-02-21 21:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:49:30.182257768 +0000 UTC m=+184.963791400" watchObservedRunningTime="2026-02-21 21:49:30.193413612 +0000 UTC m=+184.974947264" Feb 21 21:49:31 crc kubenswrapper[4717]: I0221 21:49:31.204678 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6m7gj" podStartSLOduration=3.921402382 podStartE2EDuration="35.204647568s" podCreationTimestamp="2026-02-21 21:48:56 +0000 UTC" firstStartedPulling="2026-02-21 21:48:58.350246049 +0000 UTC m=+153.131779661" lastFinishedPulling="2026-02-21 21:49:29.633491225 +0000 UTC m=+184.415024847" observedRunningTime="2026-02-21 21:49:31.198708802 
+0000 UTC m=+185.980242414" watchObservedRunningTime="2026-02-21 21:49:31.204647568 +0000 UTC m=+185.986181230" Feb 21 21:49:32 crc kubenswrapper[4717]: I0221 21:49:32.182348 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99cnm" event={"ID":"73a32126-0013-46c3-8562-1a23d1d207b1","Type":"ContainerStarted","Data":"326e9b329b36657e4860a550e3c66a0a416c2e9ba52868d907586de196047b5e"} Feb 21 21:49:33 crc kubenswrapper[4717]: I0221 21:49:33.210631 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-99cnm" podStartSLOduration=3.95328745 podStartE2EDuration="38.21060899s" podCreationTimestamp="2026-02-21 21:48:55 +0000 UTC" firstStartedPulling="2026-02-21 21:48:57.283798688 +0000 UTC m=+152.065332310" lastFinishedPulling="2026-02-21 21:49:31.541120188 +0000 UTC m=+186.322653850" observedRunningTime="2026-02-21 21:49:33.206722102 +0000 UTC m=+187.988255744" watchObservedRunningTime="2026-02-21 21:49:33.21060899 +0000 UTC m=+187.992142632" Feb 21 21:49:34 crc kubenswrapper[4717]: I0221 21:49:34.196467 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h29qn" event={"ID":"1926a2c9-1123-4ab5-b414-d91e0d56c8f2","Type":"ContainerStarted","Data":"5d9126724147095f8505c4c4e68938b03d102cfc13178b61ac47fae67a8f152f"} Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.204088 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrfqw" event={"ID":"7f963242-7453-42b2-83a3-6425582a2b9c","Type":"ContainerStarted","Data":"339d1297ab911cedecca6f39c723fd3408e515b0549f01b8ceeaf8028cd4a017"} Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.207136 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kmrb" 
event={"ID":"80125217-20f6-4337-8be2-8874b40aa10e","Type":"ContainerStarted","Data":"bbc7bd6ca0b665644cf04c0493ead4ea21ff5daac216e12cea64f2e017217f1b"} Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.209544 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7bn5" event={"ID":"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb","Type":"ContainerStarted","Data":"fb6d98df8fcd184f9195604c5ac913bae3afdaad8b4c2090c130c22460471f84"} Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.212142 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7tr6" event={"ID":"ea0c5c67-a77c-463b-8339-a73a7a9605e1","Type":"ContainerStarted","Data":"0ab6a207a2621190198631ee463a84465bbea80fe3ef38bff146ea14403f77c6"} Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.230418 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d7bn5" Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.230474 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d7bn5" Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.231982 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rrfqw" podStartSLOduration=3.203716549 podStartE2EDuration="37.231961303s" podCreationTimestamp="2026-02-21 21:48:58 +0000 UTC" firstStartedPulling="2026-02-21 21:49:00.459368321 +0000 UTC m=+155.240901943" lastFinishedPulling="2026-02-21 21:49:34.487613075 +0000 UTC m=+189.269146697" observedRunningTime="2026-02-21 21:49:35.228220438 +0000 UTC m=+190.009754060" watchObservedRunningTime="2026-02-21 21:49:35.231961303 +0000 UTC m=+190.013494925" Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.252315 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d7bn5" 
podStartSLOduration=3.097306932 podStartE2EDuration="41.252289787s" podCreationTimestamp="2026-02-21 21:48:54 +0000 UTC" firstStartedPulling="2026-02-21 21:48:56.249200631 +0000 UTC m=+151.030734253" lastFinishedPulling="2026-02-21 21:49:34.404183486 +0000 UTC m=+189.185717108" observedRunningTime="2026-02-21 21:49:35.250078426 +0000 UTC m=+190.031612048" watchObservedRunningTime="2026-02-21 21:49:35.252289787 +0000 UTC m=+190.033823419"
Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.305152 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9kmrb" podStartSLOduration=3.10387906 podStartE2EDuration="41.305128299s" podCreationTimestamp="2026-02-21 21:48:54 +0000 UTC" firstStartedPulling="2026-02-21 21:48:56.2438961 +0000 UTC m=+151.025429722" lastFinishedPulling="2026-02-21 21:49:34.445145339 +0000 UTC m=+189.226678961" observedRunningTime="2026-02-21 21:49:35.300439063 +0000 UTC m=+190.081972685" watchObservedRunningTime="2026-02-21 21:49:35.305128299 +0000 UTC m=+190.086661921"
Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.305659 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h29qn" podStartSLOduration=4.168381758 podStartE2EDuration="40.305654422s" podCreationTimestamp="2026-02-21 21:48:55 +0000 UTC" firstStartedPulling="2026-02-21 21:48:57.281172738 +0000 UTC m=+152.062706360" lastFinishedPulling="2026-02-21 21:49:33.418445402 +0000 UTC m=+188.199979024" observedRunningTime="2026-02-21 21:49:35.276335044 +0000 UTC m=+190.057868666" watchObservedRunningTime="2026-02-21 21:49:35.305654422 +0000 UTC m=+190.087188044"
Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.321508 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v7tr6" podStartSLOduration=3.240858654 podStartE2EDuration="37.321487632s" podCreationTimestamp="2026-02-21 21:48:58 +0000 UTC" firstStartedPulling="2026-02-21 21:49:00.463452764 +0000 UTC m=+155.244986386" lastFinishedPulling="2026-02-21 21:49:34.544081752 +0000 UTC m=+189.325615364" observedRunningTime="2026-02-21 21:49:35.317944311 +0000 UTC m=+190.099477933" watchObservedRunningTime="2026-02-21 21:49:35.321487632 +0000 UTC m=+190.103021254"
Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.408600 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-99cnm"
Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.408670 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-99cnm"
Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.454843 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-99cnm"
Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.647694 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h29qn"
Feb 21 21:49:35 crc kubenswrapper[4717]: I0221 21:49:35.647790 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h29qn"
Feb 21 21:49:36 crc kubenswrapper[4717]: I0221 21:49:36.265429 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-99cnm"
Feb 21 21:49:36 crc kubenswrapper[4717]: I0221 21:49:36.399134 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-d7bn5" podUID="a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" containerName="registry-server" probeResult="failure" output=<
Feb 21 21:49:36 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s
Feb 21 21:49:36 crc kubenswrapper[4717]: >
Feb 21 21:49:36 crc kubenswrapper[4717]: I0221 21:49:36.688711 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-h29qn" podUID="1926a2c9-1123-4ab5-b414-d91e0d56c8f2" containerName="registry-server" probeResult="failure" output=<
Feb 21 21:49:36 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s
Feb 21 21:49:36 crc kubenswrapper[4717]: >
Feb 21 21:49:37 crc kubenswrapper[4717]: I0221 21:49:37.002028 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6m7gj"
Feb 21 21:49:37 crc kubenswrapper[4717]: I0221 21:49:37.002783 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6m7gj"
Feb 21 21:49:37 crc kubenswrapper[4717]: I0221 21:49:37.039876 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6m7gj"
Feb 21 21:49:37 crc kubenswrapper[4717]: I0221 21:49:37.270662 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6m7gj"
Feb 21 21:49:38 crc kubenswrapper[4717]: I0221 21:49:38.466888 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v7tr6"
Feb 21 21:49:38 crc kubenswrapper[4717]: I0221 21:49:38.466963 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v7tr6"
Feb 21 21:49:38 crc kubenswrapper[4717]: I0221 21:49:38.913982 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rrfqw"
Feb 21 21:49:38 crc kubenswrapper[4717]: I0221 21:49:38.914549 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rrfqw"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.062708 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.062803 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.299847 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-99cnm"]
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.300188 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-99cnm" podUID="73a32126-0013-46c3-8562-1a23d1d207b1" containerName="registry-server" containerID="cri-o://326e9b329b36657e4860a550e3c66a0a416c2e9ba52868d907586de196047b5e" gracePeriod=2
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.307412 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 21 21:49:39 crc kubenswrapper[4717]: E0221 21:49:39.307759 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45be6352-187d-42b5-8465-4ef03f908a9a" containerName="pruner"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.307828 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="45be6352-187d-42b5-8465-4ef03f908a9a" containerName="pruner"
Feb 21 21:49:39 crc kubenswrapper[4717]: E0221 21:49:39.307923 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cface17-9c96-4fff-b28a-d41043906ab2" containerName="pruner"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.307977 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cface17-9c96-4fff-b28a-d41043906ab2" containerName="pruner"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.308152 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cface17-9c96-4fff-b28a-d41043906ab2" containerName="pruner"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.308223 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="45be6352-187d-42b5-8465-4ef03f908a9a" containerName="pruner"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.308692 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.313022 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.313333 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.318682 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.319212 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be248144-950a-4833-be06-53d375c61bd6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"be248144-950a-4833-be06-53d375c61bd6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.319302 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be248144-950a-4833-be06-53d375c61bd6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"be248144-950a-4833-be06-53d375c61bd6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.423100 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be248144-950a-4833-be06-53d375c61bd6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"be248144-950a-4833-be06-53d375c61bd6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.423179 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be248144-950a-4833-be06-53d375c61bd6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"be248144-950a-4833-be06-53d375c61bd6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.423316 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be248144-950a-4833-be06-53d375c61bd6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"be248144-950a-4833-be06-53d375c61bd6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.445616 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be248144-950a-4833-be06-53d375c61bd6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"be248144-950a-4833-be06-53d375c61bd6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.513256 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v7tr6" podUID="ea0c5c67-a77c-463b-8339-a73a7a9605e1" containerName="registry-server" probeResult="failure" output=<
Feb 21 21:49:39 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s
Feb 21 21:49:39 crc kubenswrapper[4717]: >
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.650784 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.763828 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-99cnm"
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.829496 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjwpm\" (UniqueName: \"kubernetes.io/projected/73a32126-0013-46c3-8562-1a23d1d207b1-kube-api-access-bjwpm\") pod \"73a32126-0013-46c3-8562-1a23d1d207b1\" (UID: \"73a32126-0013-46c3-8562-1a23d1d207b1\") "
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.829564 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a32126-0013-46c3-8562-1a23d1d207b1-catalog-content\") pod \"73a32126-0013-46c3-8562-1a23d1d207b1\" (UID: \"73a32126-0013-46c3-8562-1a23d1d207b1\") "
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.829741 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a32126-0013-46c3-8562-1a23d1d207b1-utilities\") pod \"73a32126-0013-46c3-8562-1a23d1d207b1\" (UID: \"73a32126-0013-46c3-8562-1a23d1d207b1\") "
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.830765 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a32126-0013-46c3-8562-1a23d1d207b1-utilities" (OuterVolumeSpecName: "utilities") pod "73a32126-0013-46c3-8562-1a23d1d207b1" (UID: "73a32126-0013-46c3-8562-1a23d1d207b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.834987 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a32126-0013-46c3-8562-1a23d1d207b1-kube-api-access-bjwpm" (OuterVolumeSpecName: "kube-api-access-bjwpm") pod "73a32126-0013-46c3-8562-1a23d1d207b1" (UID: "73a32126-0013-46c3-8562-1a23d1d207b1"). InnerVolumeSpecName "kube-api-access-bjwpm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.904405 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a32126-0013-46c3-8562-1a23d1d207b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73a32126-0013-46c3-8562-1a23d1d207b1" (UID: "73a32126-0013-46c3-8562-1a23d1d207b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.909249 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 21 21:49:39 crc kubenswrapper[4717]: W0221 21:49:39.919148 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbe248144_950a_4833_be06_53d375c61bd6.slice/crio-c1d04986f7d7113f088cbc455becefa80bd0757d22a8284e3bc85ee35e50f72d WatchSource:0}: Error finding container c1d04986f7d7113f088cbc455becefa80bd0757d22a8284e3bc85ee35e50f72d: Status 404 returned error can't find the container with id c1d04986f7d7113f088cbc455becefa80bd0757d22a8284e3bc85ee35e50f72d
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.930757 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a32126-0013-46c3-8562-1a23d1d207b1-utilities\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.930785 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjwpm\" (UniqueName: \"kubernetes.io/projected/73a32126-0013-46c3-8562-1a23d1d207b1-kube-api-access-bjwpm\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.930795 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a32126-0013-46c3-8562-1a23d1d207b1-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:39 crc kubenswrapper[4717]: I0221 21:49:39.960335 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rrfqw" podUID="7f963242-7453-42b2-83a3-6425582a2b9c" containerName="registry-server" probeResult="failure" output=<
Feb 21 21:49:39 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s
Feb 21 21:49:39 crc kubenswrapper[4717]: >
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.244698 4717 generic.go:334] "Generic (PLEG): container finished" podID="73a32126-0013-46c3-8562-1a23d1d207b1" containerID="326e9b329b36657e4860a550e3c66a0a416c2e9ba52868d907586de196047b5e" exitCode=0
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.244840 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-99cnm"
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.245553 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99cnm" event={"ID":"73a32126-0013-46c3-8562-1a23d1d207b1","Type":"ContainerDied","Data":"326e9b329b36657e4860a550e3c66a0a416c2e9ba52868d907586de196047b5e"}
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.245586 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99cnm" event={"ID":"73a32126-0013-46c3-8562-1a23d1d207b1","Type":"ContainerDied","Data":"c48550bb1025db850b5ac184782300c0124da0272bec3ef38eb345299d7b7cfc"}
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.245603 4717 scope.go:117] "RemoveContainer" containerID="326e9b329b36657e4860a550e3c66a0a416c2e9ba52868d907586de196047b5e"
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.248523 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"be248144-950a-4833-be06-53d375c61bd6","Type":"ContainerStarted","Data":"c1d04986f7d7113f088cbc455becefa80bd0757d22a8284e3bc85ee35e50f72d"}
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.267219 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-99cnm"]
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.267681 4717 scope.go:117] "RemoveContainer" containerID="663d5985cf627bdc61f6264f1268b80b99a276363c895fb9edb69313a2e0b0b4"
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.270656 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-99cnm"]
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.286983 4717 scope.go:117] "RemoveContainer" containerID="00c2a1ae8558363f61608b5fff166b1a4452394fd57a5c53c3406daf18f83025"
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.307920 4717 scope.go:117] "RemoveContainer" containerID="326e9b329b36657e4860a550e3c66a0a416c2e9ba52868d907586de196047b5e"
Feb 21 21:49:40 crc kubenswrapper[4717]: E0221 21:49:40.308466 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"326e9b329b36657e4860a550e3c66a0a416c2e9ba52868d907586de196047b5e\": container with ID starting with 326e9b329b36657e4860a550e3c66a0a416c2e9ba52868d907586de196047b5e not found: ID does not exist" containerID="326e9b329b36657e4860a550e3c66a0a416c2e9ba52868d907586de196047b5e"
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.308495 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"326e9b329b36657e4860a550e3c66a0a416c2e9ba52868d907586de196047b5e"} err="failed to get container status \"326e9b329b36657e4860a550e3c66a0a416c2e9ba52868d907586de196047b5e\": rpc error: code = NotFound desc = could not find container \"326e9b329b36657e4860a550e3c66a0a416c2e9ba52868d907586de196047b5e\": container with ID starting with 326e9b329b36657e4860a550e3c66a0a416c2e9ba52868d907586de196047b5e not found: ID does not exist"
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.308538 4717 scope.go:117] "RemoveContainer" containerID="663d5985cf627bdc61f6264f1268b80b99a276363c895fb9edb69313a2e0b0b4"
Feb 21 21:49:40 crc kubenswrapper[4717]: E0221 21:49:40.309037 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"663d5985cf627bdc61f6264f1268b80b99a276363c895fb9edb69313a2e0b0b4\": container with ID starting with 663d5985cf627bdc61f6264f1268b80b99a276363c895fb9edb69313a2e0b0b4 not found: ID does not exist" containerID="663d5985cf627bdc61f6264f1268b80b99a276363c895fb9edb69313a2e0b0b4"
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.309073 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663d5985cf627bdc61f6264f1268b80b99a276363c895fb9edb69313a2e0b0b4"} err="failed to get container status \"663d5985cf627bdc61f6264f1268b80b99a276363c895fb9edb69313a2e0b0b4\": rpc error: code = NotFound desc = could not find container \"663d5985cf627bdc61f6264f1268b80b99a276363c895fb9edb69313a2e0b0b4\": container with ID starting with 663d5985cf627bdc61f6264f1268b80b99a276363c895fb9edb69313a2e0b0b4 not found: ID does not exist"
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.309098 4717 scope.go:117] "RemoveContainer" containerID="00c2a1ae8558363f61608b5fff166b1a4452394fd57a5c53c3406daf18f83025"
Feb 21 21:49:40 crc kubenswrapper[4717]: E0221 21:49:40.309700 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00c2a1ae8558363f61608b5fff166b1a4452394fd57a5c53c3406daf18f83025\": container with ID starting with 00c2a1ae8558363f61608b5fff166b1a4452394fd57a5c53c3406daf18f83025 not found: ID does not exist" containerID="00c2a1ae8558363f61608b5fff166b1a4452394fd57a5c53c3406daf18f83025"
Feb 21 21:49:40 crc kubenswrapper[4717]: I0221 21:49:40.309726 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00c2a1ae8558363f61608b5fff166b1a4452394fd57a5c53c3406daf18f83025"} err="failed to get container status \"00c2a1ae8558363f61608b5fff166b1a4452394fd57a5c53c3406daf18f83025\": rpc error: code = NotFound desc = could not find container \"00c2a1ae8558363f61608b5fff166b1a4452394fd57a5c53c3406daf18f83025\": container with ID starting with 00c2a1ae8558363f61608b5fff166b1a4452394fd57a5c53c3406daf18f83025 not found: ID does not exist"
Feb 21 21:49:41 crc kubenswrapper[4717]: I0221 21:49:41.259736 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"be248144-950a-4833-be06-53d375c61bd6","Type":"ContainerStarted","Data":"a30d03fe5bc1783f1b37b0b3f1c75bc7c6a5d6ff0b8ae7429c6c87b9cb1f1ae0"}
Feb 21 21:49:41 crc kubenswrapper[4717]: I0221 21:49:41.282515 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.282494735 podStartE2EDuration="2.282494735s" podCreationTimestamp="2026-02-21 21:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:49:41.279382404 +0000 UTC m=+196.060916026" watchObservedRunningTime="2026-02-21 21:49:41.282494735 +0000 UTC m=+196.064028357"
Feb 21 21:49:41 crc kubenswrapper[4717]: I0221 21:49:41.986817 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a32126-0013-46c3-8562-1a23d1d207b1" path="/var/lib/kubelet/pods/73a32126-0013-46c3-8562-1a23d1d207b1/volumes"
Feb 21 21:49:42 crc kubenswrapper[4717]: I0221 21:49:42.271808 4717 generic.go:334] "Generic (PLEG): container finished" podID="be248144-950a-4833-be06-53d375c61bd6" containerID="a30d03fe5bc1783f1b37b0b3f1c75bc7c6a5d6ff0b8ae7429c6c87b9cb1f1ae0" exitCode=0
Feb 21 21:49:42 crc kubenswrapper[4717]: I0221 21:49:42.271902 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"be248144-950a-4833-be06-53d375c61bd6","Type":"ContainerDied","Data":"a30d03fe5bc1783f1b37b0b3f1c75bc7c6a5d6ff0b8ae7429c6c87b9cb1f1ae0"}
Feb 21 21:49:43 crc kubenswrapper[4717]: I0221 21:49:43.279261 4717 generic.go:334] "Generic (PLEG): container finished" podID="280cdbf0-4686-47a5-a48e-55562853f0f6" containerID="fb79cd3c0d0e5c339af81f991233a3b9d44398f0b29cc5963cc0893867b41748" exitCode=0
Feb 21 21:49:43 crc kubenswrapper[4717]: I0221 21:49:43.279343 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q54qm" event={"ID":"280cdbf0-4686-47a5-a48e-55562853f0f6","Type":"ContainerDied","Data":"fb79cd3c0d0e5c339af81f991233a3b9d44398f0b29cc5963cc0893867b41748"}
Feb 21 21:49:43 crc kubenswrapper[4717]: I0221 21:49:43.628486 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 21 21:49:43 crc kubenswrapper[4717]: I0221 21:49:43.686381 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be248144-950a-4833-be06-53d375c61bd6-kubelet-dir\") pod \"be248144-950a-4833-be06-53d375c61bd6\" (UID: \"be248144-950a-4833-be06-53d375c61bd6\") "
Feb 21 21:49:43 crc kubenswrapper[4717]: I0221 21:49:43.686657 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be248144-950a-4833-be06-53d375c61bd6-kube-api-access\") pod \"be248144-950a-4833-be06-53d375c61bd6\" (UID: \"be248144-950a-4833-be06-53d375c61bd6\") "
Feb 21 21:49:43 crc kubenswrapper[4717]: I0221 21:49:43.690013 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be248144-950a-4833-be06-53d375c61bd6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "be248144-950a-4833-be06-53d375c61bd6" (UID: "be248144-950a-4833-be06-53d375c61bd6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 21:49:43 crc kubenswrapper[4717]: I0221 21:49:43.694646 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be248144-950a-4833-be06-53d375c61bd6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "be248144-950a-4833-be06-53d375c61bd6" (UID: "be248144-950a-4833-be06-53d375c61bd6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:49:43 crc kubenswrapper[4717]: I0221 21:49:43.789227 4717 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be248144-950a-4833-be06-53d375c61bd6-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:43 crc kubenswrapper[4717]: I0221 21:49:43.789290 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be248144-950a-4833-be06-53d375c61bd6-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.295053 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"be248144-950a-4833-be06-53d375c61bd6","Type":"ContainerDied","Data":"c1d04986f7d7113f088cbc455becefa80bd0757d22a8284e3bc85ee35e50f72d"}
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.295541 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1d04986f7d7113f088cbc455becefa80bd0757d22a8284e3bc85ee35e50f72d"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.295132 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.301389 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q54qm" event={"ID":"280cdbf0-4686-47a5-a48e-55562853f0f6","Type":"ContainerStarted","Data":"de0e4b202a4d4805cb4f01f6fb11bea244223364b4850732fbb2060d607f0817"}
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.333971 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q54qm" podStartSLOduration=1.9515600549999998 podStartE2EDuration="47.333940049s" podCreationTimestamp="2026-02-21 21:48:57 +0000 UTC" firstStartedPulling="2026-02-21 21:48:58.344735614 +0000 UTC m=+153.126269236" lastFinishedPulling="2026-02-21 21:49:43.727115578 +0000 UTC m=+198.508649230" observedRunningTime="2026-02-21 21:49:44.330665505 +0000 UTC m=+199.112199197" watchObservedRunningTime="2026-02-21 21:49:44.333940049 +0000 UTC m=+199.115473741"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.701309 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 21 21:49:44 crc kubenswrapper[4717]: E0221 21:49:44.704200 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a32126-0013-46c3-8562-1a23d1d207b1" containerName="extract-content"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.704426 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a32126-0013-46c3-8562-1a23d1d207b1" containerName="extract-content"
Feb 21 21:49:44 crc kubenswrapper[4717]: E0221 21:49:44.704581 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a32126-0013-46c3-8562-1a23d1d207b1" containerName="extract-utilities"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.704703 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a32126-0013-46c3-8562-1a23d1d207b1" containerName="extract-utilities"
Feb 21 21:49:44 crc kubenswrapper[4717]: E0221 21:49:44.704898 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be248144-950a-4833-be06-53d375c61bd6" containerName="pruner"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.705225 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="be248144-950a-4833-be06-53d375c61bd6" containerName="pruner"
Feb 21 21:49:44 crc kubenswrapper[4717]: E0221 21:49:44.705385 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a32126-0013-46c3-8562-1a23d1d207b1" containerName="registry-server"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.705521 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a32126-0013-46c3-8562-1a23d1d207b1" containerName="registry-server"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.705836 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="be248144-950a-4833-be06-53d375c61bd6" containerName="pruner"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.706055 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a32126-0013-46c3-8562-1a23d1d207b1" containerName="registry-server"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.706947 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.713633 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.716761 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.717046 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.803133 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/65a6a817-09f9-4c86-a91a-6d4b695cedd1-var-lock\") pod \"installer-9-crc\" (UID: \"65a6a817-09f9-4c86-a91a-6d4b695cedd1\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.803185 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65a6a817-09f9-4c86-a91a-6d4b695cedd1-kube-api-access\") pod \"installer-9-crc\" (UID: \"65a6a817-09f9-4c86-a91a-6d4b695cedd1\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.803209 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65a6a817-09f9-4c86-a91a-6d4b695cedd1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"65a6a817-09f9-4c86-a91a-6d4b695cedd1\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.905245 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/65a6a817-09f9-4c86-a91a-6d4b695cedd1-var-lock\") pod \"installer-9-crc\" (UID: \"65a6a817-09f9-4c86-a91a-6d4b695cedd1\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.905323 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65a6a817-09f9-4c86-a91a-6d4b695cedd1-kube-api-access\") pod \"installer-9-crc\" (UID: \"65a6a817-09f9-4c86-a91a-6d4b695cedd1\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.905359 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65a6a817-09f9-4c86-a91a-6d4b695cedd1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"65a6a817-09f9-4c86-a91a-6d4b695cedd1\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.905398 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/65a6a817-09f9-4c86-a91a-6d4b695cedd1-var-lock\") pod \"installer-9-crc\" (UID: \"65a6a817-09f9-4c86-a91a-6d4b695cedd1\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.905476 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65a6a817-09f9-4c86-a91a-6d4b695cedd1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"65a6a817-09f9-4c86-a91a-6d4b695cedd1\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 21 21:49:44 crc kubenswrapper[4717]: I0221 21:49:44.935798 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65a6a817-09f9-4c86-a91a-6d4b695cedd1-kube-api-access\") pod \"installer-9-crc\" (UID: \"65a6a817-09f9-4c86-a91a-6d4b695cedd1\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 21 21:49:45 crc kubenswrapper[4717]: I0221 21:49:45.033165 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9kmrb"
Feb 21 21:49:45 crc kubenswrapper[4717]: I0221 21:49:45.033260 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9kmrb"
Feb 21 21:49:45 crc kubenswrapper[4717]: I0221 21:49:45.033472 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 21 21:49:45 crc kubenswrapper[4717]: I0221 21:49:45.087604 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9kmrb"
Feb 21 21:49:45 crc kubenswrapper[4717]: I0221 21:49:45.296535 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d7bn5"
Feb 21 21:49:45 crc kubenswrapper[4717]: I0221 21:49:45.357020 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 21 21:49:45 crc kubenswrapper[4717]: W0221 21:49:45.378837 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod65a6a817_09f9_4c86_a91a_6d4b695cedd1.slice/crio-d26df56327ff1083375876437479f3afdad5dd5d6c62fdc8df5691cdbf5d4656 WatchSource:0}: Error finding container d26df56327ff1083375876437479f3afdad5dd5d6c62fdc8df5691cdbf5d4656: Status 404 returned error can't find the container with id d26df56327ff1083375876437479f3afdad5dd5d6c62fdc8df5691cdbf5d4656
Feb 21 21:49:45 crc kubenswrapper[4717]: I0221 21:49:45.380736 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d7bn5"
Feb 21 21:49:45 crc kubenswrapper[4717]: I0221 21:49:45.382053 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9kmrb"
Feb 21 21:49:45 crc kubenswrapper[4717]: I0221 21:49:45.732076 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h29qn"
Feb 21 21:49:45 crc kubenswrapper[4717]: I0221 21:49:45.840840 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h29qn"
Feb 21 21:49:46 crc kubenswrapper[4717]: I0221 21:49:46.318657 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"65a6a817-09f9-4c86-a91a-6d4b695cedd1","Type":"ContainerStarted","Data":"8dfe27d2f9908c9888ca788054a73153cd44fbd4e15a55de1d50454f8f9ef383"}
Feb 21 21:49:46 crc kubenswrapper[4717]: I0221 21:49:46.318742 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"65a6a817-09f9-4c86-a91a-6d4b695cedd1","Type":"ContainerStarted","Data":"d26df56327ff1083375876437479f3afdad5dd5d6c62fdc8df5691cdbf5d4656"}
Feb 21 21:49:46 crc kubenswrapper[4717]: I0221 21:49:46.341162 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.341132193 podStartE2EDuration="2.341132193s" podCreationTimestamp="2026-02-21 21:49:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:49:46.336374706 +0000 UTC m=+201.117908368" watchObservedRunningTime="2026-02-21 21:49:46.341132193 +0000 UTC m=+201.122665845"
Feb 21 21:49:47 crc kubenswrapper[4717]: I0221 21:49:47.414907 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q54qm"
Feb 21 21:49:47 crc kubenswrapper[4717]: I0221 21:49:47.414968 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy"
pod="openshift-marketplace/redhat-marketplace-q54qm" Feb 21 21:49:47 crc kubenswrapper[4717]: I0221 21:49:47.485253 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q54qm" Feb 21 21:49:48 crc kubenswrapper[4717]: I0221 21:49:48.380517 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q54qm" Feb 21 21:49:48 crc kubenswrapper[4717]: I0221 21:49:48.529797 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v7tr6" Feb 21 21:49:48 crc kubenswrapper[4717]: I0221 21:49:48.600739 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v7tr6" Feb 21 21:49:48 crc kubenswrapper[4717]: I0221 21:49:48.987660 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rrfqw" Feb 21 21:49:49 crc kubenswrapper[4717]: I0221 21:49:49.034524 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rrfqw" Feb 21 21:49:49 crc kubenswrapper[4717]: I0221 21:49:49.499770 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h29qn"] Feb 21 21:49:49 crc kubenswrapper[4717]: I0221 21:49:49.500651 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h29qn" podUID="1926a2c9-1123-4ab5-b414-d91e0d56c8f2" containerName="registry-server" containerID="cri-o://5d9126724147095f8505c4c4e68938b03d102cfc13178b61ac47fae67a8f152f" gracePeriod=2 Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.007001 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h29qn" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.110953 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-catalog-content\") pod \"1926a2c9-1123-4ab5-b414-d91e0d56c8f2\" (UID: \"1926a2c9-1123-4ab5-b414-d91e0d56c8f2\") " Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.111095 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv85l\" (UniqueName: \"kubernetes.io/projected/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-kube-api-access-dv85l\") pod \"1926a2c9-1123-4ab5-b414-d91e0d56c8f2\" (UID: \"1926a2c9-1123-4ab5-b414-d91e0d56c8f2\") " Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.111144 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-utilities\") pod \"1926a2c9-1123-4ab5-b414-d91e0d56c8f2\" (UID: \"1926a2c9-1123-4ab5-b414-d91e0d56c8f2\") " Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.112259 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-utilities" (OuterVolumeSpecName: "utilities") pod "1926a2c9-1123-4ab5-b414-d91e0d56c8f2" (UID: "1926a2c9-1123-4ab5-b414-d91e0d56c8f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.121122 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-kube-api-access-dv85l" (OuterVolumeSpecName: "kube-api-access-dv85l") pod "1926a2c9-1123-4ab5-b414-d91e0d56c8f2" (UID: "1926a2c9-1123-4ab5-b414-d91e0d56c8f2"). InnerVolumeSpecName "kube-api-access-dv85l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.160251 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1926a2c9-1123-4ab5-b414-d91e0d56c8f2" (UID: "1926a2c9-1123-4ab5-b414-d91e0d56c8f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.213495 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.213532 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.213544 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv85l\" (UniqueName: \"kubernetes.io/projected/1926a2c9-1123-4ab5-b414-d91e0d56c8f2-kube-api-access-dv85l\") on node \"crc\" DevicePath \"\"" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.358367 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h29qn" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.358306 4717 generic.go:334] "Generic (PLEG): container finished" podID="1926a2c9-1123-4ab5-b414-d91e0d56c8f2" containerID="5d9126724147095f8505c4c4e68938b03d102cfc13178b61ac47fae67a8f152f" exitCode=0 Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.358463 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h29qn" event={"ID":"1926a2c9-1123-4ab5-b414-d91e0d56c8f2","Type":"ContainerDied","Data":"5d9126724147095f8505c4c4e68938b03d102cfc13178b61ac47fae67a8f152f"} Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.358509 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h29qn" event={"ID":"1926a2c9-1123-4ab5-b414-d91e0d56c8f2","Type":"ContainerDied","Data":"8b5c3df7356dc67de506b9469536e47246c0841de14db0692458e10ef030d12c"} Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.358541 4717 scope.go:117] "RemoveContainer" containerID="5d9126724147095f8505c4c4e68938b03d102cfc13178b61ac47fae67a8f152f" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.393990 4717 scope.go:117] "RemoveContainer" containerID="1ccbf0b34f9abc9ea3c0b42caef41b580730e87608a851b7b3eacdffaef08f76" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.395401 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h29qn"] Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.401116 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h29qn"] Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.423011 4717 scope.go:117] "RemoveContainer" containerID="6c4a1e644595310b09ae1524ab95f1e42764bbcc3cac38f176e465e10fda7a1a" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.439358 4717 scope.go:117] "RemoveContainer" 
containerID="5d9126724147095f8505c4c4e68938b03d102cfc13178b61ac47fae67a8f152f" Feb 21 21:49:50 crc kubenswrapper[4717]: E0221 21:49:50.439916 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d9126724147095f8505c4c4e68938b03d102cfc13178b61ac47fae67a8f152f\": container with ID starting with 5d9126724147095f8505c4c4e68938b03d102cfc13178b61ac47fae67a8f152f not found: ID does not exist" containerID="5d9126724147095f8505c4c4e68938b03d102cfc13178b61ac47fae67a8f152f" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.439995 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d9126724147095f8505c4c4e68938b03d102cfc13178b61ac47fae67a8f152f"} err="failed to get container status \"5d9126724147095f8505c4c4e68938b03d102cfc13178b61ac47fae67a8f152f\": rpc error: code = NotFound desc = could not find container \"5d9126724147095f8505c4c4e68938b03d102cfc13178b61ac47fae67a8f152f\": container with ID starting with 5d9126724147095f8505c4c4e68938b03d102cfc13178b61ac47fae67a8f152f not found: ID does not exist" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.440045 4717 scope.go:117] "RemoveContainer" containerID="1ccbf0b34f9abc9ea3c0b42caef41b580730e87608a851b7b3eacdffaef08f76" Feb 21 21:49:50 crc kubenswrapper[4717]: E0221 21:49:50.440464 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ccbf0b34f9abc9ea3c0b42caef41b580730e87608a851b7b3eacdffaef08f76\": container with ID starting with 1ccbf0b34f9abc9ea3c0b42caef41b580730e87608a851b7b3eacdffaef08f76 not found: ID does not exist" containerID="1ccbf0b34f9abc9ea3c0b42caef41b580730e87608a851b7b3eacdffaef08f76" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.440544 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1ccbf0b34f9abc9ea3c0b42caef41b580730e87608a851b7b3eacdffaef08f76"} err="failed to get container status \"1ccbf0b34f9abc9ea3c0b42caef41b580730e87608a851b7b3eacdffaef08f76\": rpc error: code = NotFound desc = could not find container \"1ccbf0b34f9abc9ea3c0b42caef41b580730e87608a851b7b3eacdffaef08f76\": container with ID starting with 1ccbf0b34f9abc9ea3c0b42caef41b580730e87608a851b7b3eacdffaef08f76 not found: ID does not exist" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.440573 4717 scope.go:117] "RemoveContainer" containerID="6c4a1e644595310b09ae1524ab95f1e42764bbcc3cac38f176e465e10fda7a1a" Feb 21 21:49:50 crc kubenswrapper[4717]: E0221 21:49:50.440948 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c4a1e644595310b09ae1524ab95f1e42764bbcc3cac38f176e465e10fda7a1a\": container with ID starting with 6c4a1e644595310b09ae1524ab95f1e42764bbcc3cac38f176e465e10fda7a1a not found: ID does not exist" containerID="6c4a1e644595310b09ae1524ab95f1e42764bbcc3cac38f176e465e10fda7a1a" Feb 21 21:49:50 crc kubenswrapper[4717]: I0221 21:49:50.440997 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c4a1e644595310b09ae1524ab95f1e42764bbcc3cac38f176e465e10fda7a1a"} err="failed to get container status \"6c4a1e644595310b09ae1524ab95f1e42764bbcc3cac38f176e465e10fda7a1a\": rpc error: code = NotFound desc = could not find container \"6c4a1e644595310b09ae1524ab95f1e42764bbcc3cac38f176e465e10fda7a1a\": container with ID starting with 6c4a1e644595310b09ae1524ab95f1e42764bbcc3cac38f176e465e10fda7a1a not found: ID does not exist" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.139978 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" podUID="5c73e02c-cb77-47e2-bf8a-1092ada428d5" containerName="oauth-openshift" 
containerID="cri-o://7aa4d2d943f741685966146550b2f5de890fd573890da1dc28289ff43ff2689f" gracePeriod=15 Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.371079 4717 generic.go:334] "Generic (PLEG): container finished" podID="5c73e02c-cb77-47e2-bf8a-1092ada428d5" containerID="7aa4d2d943f741685966146550b2f5de890fd573890da1dc28289ff43ff2689f" exitCode=0 Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.371194 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" event={"ID":"5c73e02c-cb77-47e2-bf8a-1092ada428d5","Type":"ContainerDied","Data":"7aa4d2d943f741685966146550b2f5de890fd573890da1dc28289ff43ff2689f"} Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.617137 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.713828 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q54qm"] Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.714151 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q54qm" podUID="280cdbf0-4686-47a5-a48e-55562853f0f6" containerName="registry-server" containerID="cri-o://de0e4b202a4d4805cb4f01f6fb11bea244223364b4850732fbb2060d607f0817" gracePeriod=2 Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.735987 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-audit-policies\") pod \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.736060 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-provider-selection\") pod \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.736125 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vvvs\" (UniqueName: \"kubernetes.io/projected/5c73e02c-cb77-47e2-bf8a-1092ada428d5-kube-api-access-9vvvs\") pod \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.736155 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-cliconfig\") pod \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.736221 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-router-certs\") pod \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.737110 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "5c73e02c-cb77-47e2-bf8a-1092ada428d5" (UID: "5c73e02c-cb77-47e2-bf8a-1092ada428d5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.737131 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "5c73e02c-cb77-47e2-bf8a-1092ada428d5" (UID: "5c73e02c-cb77-47e2-bf8a-1092ada428d5"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.737204 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-serving-cert\") pod \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.737248 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-idp-0-file-data\") pod \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.737308 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-service-ca\") pod \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.737336 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-login\") pod \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\" (UID: 
\"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.737372 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c73e02c-cb77-47e2-bf8a-1092ada428d5-audit-dir\") pod \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.737446 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-error\") pod \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.737473 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-trusted-ca-bundle\") pod \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.737499 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-ocp-branding-template\") pod \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.737557 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-session\") pod \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\" (UID: \"5c73e02c-cb77-47e2-bf8a-1092ada428d5\") " Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 
21:49:51.737902 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "5c73e02c-cb77-47e2-bf8a-1092ada428d5" (UID: "5c73e02c-cb77-47e2-bf8a-1092ada428d5"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.737963 4717 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.737984 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.738475 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c73e02c-cb77-47e2-bf8a-1092ada428d5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5c73e02c-cb77-47e2-bf8a-1092ada428d5" (UID: "5c73e02c-cb77-47e2-bf8a-1092ada428d5"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.739169 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "5c73e02c-cb77-47e2-bf8a-1092ada428d5" (UID: "5c73e02c-cb77-47e2-bf8a-1092ada428d5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.743663 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "5c73e02c-cb77-47e2-bf8a-1092ada428d5" (UID: "5c73e02c-cb77-47e2-bf8a-1092ada428d5"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.747617 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "5c73e02c-cb77-47e2-bf8a-1092ada428d5" (UID: "5c73e02c-cb77-47e2-bf8a-1092ada428d5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.747752 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c73e02c-cb77-47e2-bf8a-1092ada428d5-kube-api-access-9vvvs" (OuterVolumeSpecName: "kube-api-access-9vvvs") pod "5c73e02c-cb77-47e2-bf8a-1092ada428d5" (UID: "5c73e02c-cb77-47e2-bf8a-1092ada428d5"). InnerVolumeSpecName "kube-api-access-9vvvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.748251 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "5c73e02c-cb77-47e2-bf8a-1092ada428d5" (UID: "5c73e02c-cb77-47e2-bf8a-1092ada428d5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.748955 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "5c73e02c-cb77-47e2-bf8a-1092ada428d5" (UID: "5c73e02c-cb77-47e2-bf8a-1092ada428d5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.749344 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "5c73e02c-cb77-47e2-bf8a-1092ada428d5" (UID: "5c73e02c-cb77-47e2-bf8a-1092ada428d5"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.750895 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "5c73e02c-cb77-47e2-bf8a-1092ada428d5" (UID: "5c73e02c-cb77-47e2-bf8a-1092ada428d5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.751762 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "5c73e02c-cb77-47e2-bf8a-1092ada428d5" (UID: "5c73e02c-cb77-47e2-bf8a-1092ada428d5"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.752170 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "5c73e02c-cb77-47e2-bf8a-1092ada428d5" (UID: "5c73e02c-cb77-47e2-bf8a-1092ada428d5"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.839964 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.840011 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.840034 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.840056 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.840076 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.840097 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vvvs\" (UniqueName: \"kubernetes.io/projected/5c73e02c-cb77-47e2-bf8a-1092ada428d5-kube-api-access-9vvvs\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.840116 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.840134 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.840152 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.840170 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.840187 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5c73e02c-cb77-47e2-bf8a-1092ada428d5-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.840206 4717 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c73e02c-cb77-47e2-bf8a-1092ada428d5-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.903292 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rrfqw"]
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.904014 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rrfqw" podUID="7f963242-7453-42b2-83a3-6425582a2b9c" containerName="registry-server" containerID="cri-o://339d1297ab911cedecca6f39c723fd3408e515b0549f01b8ceeaf8028cd4a017" gracePeriod=2
Feb 21 21:49:51 crc kubenswrapper[4717]: I0221 21:49:51.993994 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1926a2c9-1123-4ab5-b414-d91e0d56c8f2" path="/var/lib/kubelet/pods/1926a2c9-1123-4ab5-b414-d91e0d56c8f2/volumes"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.158613 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q54qm"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.262276 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rrfqw"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.347646 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/280cdbf0-4686-47a5-a48e-55562853f0f6-catalog-content\") pod \"280cdbf0-4686-47a5-a48e-55562853f0f6\" (UID: \"280cdbf0-4686-47a5-a48e-55562853f0f6\") "
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.347699 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6npc\" (UniqueName: \"kubernetes.io/projected/280cdbf0-4686-47a5-a48e-55562853f0f6-kube-api-access-q6npc\") pod \"280cdbf0-4686-47a5-a48e-55562853f0f6\" (UID: \"280cdbf0-4686-47a5-a48e-55562853f0f6\") "
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.347877 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/280cdbf0-4686-47a5-a48e-55562853f0f6-utilities\") pod \"280cdbf0-4686-47a5-a48e-55562853f0f6\" (UID: \"280cdbf0-4686-47a5-a48e-55562853f0f6\") "
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.349078 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/280cdbf0-4686-47a5-a48e-55562853f0f6-utilities" (OuterVolumeSpecName: "utilities") pod "280cdbf0-4686-47a5-a48e-55562853f0f6" (UID: "280cdbf0-4686-47a5-a48e-55562853f0f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.352338 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/280cdbf0-4686-47a5-a48e-55562853f0f6-kube-api-access-q6npc" (OuterVolumeSpecName: "kube-api-access-q6npc") pod "280cdbf0-4686-47a5-a48e-55562853f0f6" (UID: "280cdbf0-4686-47a5-a48e-55562853f0f6"). InnerVolumeSpecName "kube-api-access-q6npc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.375391 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/280cdbf0-4686-47a5-a48e-55562853f0f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "280cdbf0-4686-47a5-a48e-55562853f0f6" (UID: "280cdbf0-4686-47a5-a48e-55562853f0f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.379048 4717 generic.go:334] "Generic (PLEG): container finished" podID="7f963242-7453-42b2-83a3-6425582a2b9c" containerID="339d1297ab911cedecca6f39c723fd3408e515b0549f01b8ceeaf8028cd4a017" exitCode=0
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.379102 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rrfqw"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.379106 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrfqw" event={"ID":"7f963242-7453-42b2-83a3-6425582a2b9c","Type":"ContainerDied","Data":"339d1297ab911cedecca6f39c723fd3408e515b0549f01b8ceeaf8028cd4a017"}
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.379154 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrfqw" event={"ID":"7f963242-7453-42b2-83a3-6425582a2b9c","Type":"ContainerDied","Data":"164718ccc9353ee8cf934d19ff954f43029ba5409da67c703eb6d5405b0f7fb8"}
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.379174 4717 scope.go:117] "RemoveContainer" containerID="339d1297ab911cedecca6f39c723fd3408e515b0549f01b8ceeaf8028cd4a017"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.381781 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-76zmn" event={"ID":"5c73e02c-cb77-47e2-bf8a-1092ada428d5","Type":"ContainerDied","Data":"2c1aacf0d9100b99eedbff5861f8ed79eb3254b5f9201d39173d1f0fa76715b6"}
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.381909 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-76zmn"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.387799 4717 generic.go:334] "Generic (PLEG): container finished" podID="280cdbf0-4686-47a5-a48e-55562853f0f6" containerID="de0e4b202a4d4805cb4f01f6fb11bea244223364b4850732fbb2060d607f0817" exitCode=0
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.387924 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q54qm"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.387923 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q54qm" event={"ID":"280cdbf0-4686-47a5-a48e-55562853f0f6","Type":"ContainerDied","Data":"de0e4b202a4d4805cb4f01f6fb11bea244223364b4850732fbb2060d607f0817"}
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.388169 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q54qm" event={"ID":"280cdbf0-4686-47a5-a48e-55562853f0f6","Type":"ContainerDied","Data":"7e8b7594d596c20431c54865ceb53696fab76ccca82852ceac81ec677e8f3952"}
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.408290 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-76zmn"]
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.409952 4717 scope.go:117] "RemoveContainer" containerID="91d0adb9227dfb56d3a9894aa985430efaf54685e8177db6c77e8df4f4c299c1"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.411695 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-76zmn"]
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.431067 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q54qm"]
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.431493 4717 scope.go:117] "RemoveContainer" containerID="517ece5c1cdcd2d6d7c58b0e4c32d3335e9f3cdacfe1b7110fabf75ae26be178"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.434299 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q54qm"]
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.448657 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f963242-7453-42b2-83a3-6425582a2b9c-catalog-content\") pod \"7f963242-7453-42b2-83a3-6425582a2b9c\" (UID: \"7f963242-7453-42b2-83a3-6425582a2b9c\") "
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.448765 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f963242-7453-42b2-83a3-6425582a2b9c-utilities\") pod \"7f963242-7453-42b2-83a3-6425582a2b9c\" (UID: \"7f963242-7453-42b2-83a3-6425582a2b9c\") "
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.448810 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdmml\" (UniqueName: \"kubernetes.io/projected/7f963242-7453-42b2-83a3-6425582a2b9c-kube-api-access-rdmml\") pod \"7f963242-7453-42b2-83a3-6425582a2b9c\" (UID: \"7f963242-7453-42b2-83a3-6425582a2b9c\") "
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.449530 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/280cdbf0-4686-47a5-a48e-55562853f0f6-utilities\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.449549 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/280cdbf0-4686-47a5-a48e-55562853f0f6-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.449559 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6npc\" (UniqueName: \"kubernetes.io/projected/280cdbf0-4686-47a5-a48e-55562853f0f6-kube-api-access-q6npc\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.449576 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f963242-7453-42b2-83a3-6425582a2b9c-utilities" (OuterVolumeSpecName: "utilities") pod "7f963242-7453-42b2-83a3-6425582a2b9c" (UID: "7f963242-7453-42b2-83a3-6425582a2b9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.455777 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f963242-7453-42b2-83a3-6425582a2b9c-kube-api-access-rdmml" (OuterVolumeSpecName: "kube-api-access-rdmml") pod "7f963242-7453-42b2-83a3-6425582a2b9c" (UID: "7f963242-7453-42b2-83a3-6425582a2b9c"). InnerVolumeSpecName "kube-api-access-rdmml". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.456873 4717 scope.go:117] "RemoveContainer" containerID="339d1297ab911cedecca6f39c723fd3408e515b0549f01b8ceeaf8028cd4a017"
Feb 21 21:49:52 crc kubenswrapper[4717]: E0221 21:49:52.457253 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"339d1297ab911cedecca6f39c723fd3408e515b0549f01b8ceeaf8028cd4a017\": container with ID starting with 339d1297ab911cedecca6f39c723fd3408e515b0549f01b8ceeaf8028cd4a017 not found: ID does not exist" containerID="339d1297ab911cedecca6f39c723fd3408e515b0549f01b8ceeaf8028cd4a017"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.457304 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"339d1297ab911cedecca6f39c723fd3408e515b0549f01b8ceeaf8028cd4a017"} err="failed to get container status \"339d1297ab911cedecca6f39c723fd3408e515b0549f01b8ceeaf8028cd4a017\": rpc error: code = NotFound desc = could not find container \"339d1297ab911cedecca6f39c723fd3408e515b0549f01b8ceeaf8028cd4a017\": container with ID starting with 339d1297ab911cedecca6f39c723fd3408e515b0549f01b8ceeaf8028cd4a017 not found: ID does not exist"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.457325 4717 scope.go:117] "RemoveContainer" containerID="91d0adb9227dfb56d3a9894aa985430efaf54685e8177db6c77e8df4f4c299c1"
Feb 21 21:49:52 crc kubenswrapper[4717]: E0221 21:49:52.457746 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d0adb9227dfb56d3a9894aa985430efaf54685e8177db6c77e8df4f4c299c1\": container with ID starting with 91d0adb9227dfb56d3a9894aa985430efaf54685e8177db6c77e8df4f4c299c1 not found: ID does not exist" containerID="91d0adb9227dfb56d3a9894aa985430efaf54685e8177db6c77e8df4f4c299c1"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.457770 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d0adb9227dfb56d3a9894aa985430efaf54685e8177db6c77e8df4f4c299c1"} err="failed to get container status \"91d0adb9227dfb56d3a9894aa985430efaf54685e8177db6c77e8df4f4c299c1\": rpc error: code = NotFound desc = could not find container \"91d0adb9227dfb56d3a9894aa985430efaf54685e8177db6c77e8df4f4c299c1\": container with ID starting with 91d0adb9227dfb56d3a9894aa985430efaf54685e8177db6c77e8df4f4c299c1 not found: ID does not exist"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.457784 4717 scope.go:117] "RemoveContainer" containerID="517ece5c1cdcd2d6d7c58b0e4c32d3335e9f3cdacfe1b7110fabf75ae26be178"
Feb 21 21:49:52 crc kubenswrapper[4717]: E0221 21:49:52.458137 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"517ece5c1cdcd2d6d7c58b0e4c32d3335e9f3cdacfe1b7110fabf75ae26be178\": container with ID starting with 517ece5c1cdcd2d6d7c58b0e4c32d3335e9f3cdacfe1b7110fabf75ae26be178 not found: ID does not exist" containerID="517ece5c1cdcd2d6d7c58b0e4c32d3335e9f3cdacfe1b7110fabf75ae26be178"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.458164 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"517ece5c1cdcd2d6d7c58b0e4c32d3335e9f3cdacfe1b7110fabf75ae26be178"} err="failed to get container status \"517ece5c1cdcd2d6d7c58b0e4c32d3335e9f3cdacfe1b7110fabf75ae26be178\": rpc error: code = NotFound desc = could not find container \"517ece5c1cdcd2d6d7c58b0e4c32d3335e9f3cdacfe1b7110fabf75ae26be178\": container with ID starting with 517ece5c1cdcd2d6d7c58b0e4c32d3335e9f3cdacfe1b7110fabf75ae26be178 not found: ID does not exist"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.458178 4717 scope.go:117] "RemoveContainer" containerID="7aa4d2d943f741685966146550b2f5de890fd573890da1dc28289ff43ff2689f"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.475120 4717 scope.go:117] "RemoveContainer" containerID="de0e4b202a4d4805cb4f01f6fb11bea244223364b4850732fbb2060d607f0817"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.494768 4717 scope.go:117] "RemoveContainer" containerID="fb79cd3c0d0e5c339af81f991233a3b9d44398f0b29cc5963cc0893867b41748"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.509387 4717 scope.go:117] "RemoveContainer" containerID="7bad77e341c5927ff1c2ac1170cc55f40c8075fce6dfb311be9e5a8feee4d068"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.523971 4717 scope.go:117] "RemoveContainer" containerID="de0e4b202a4d4805cb4f01f6fb11bea244223364b4850732fbb2060d607f0817"
Feb 21 21:49:52 crc kubenswrapper[4717]: E0221 21:49:52.524410 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de0e4b202a4d4805cb4f01f6fb11bea244223364b4850732fbb2060d607f0817\": container with ID starting with de0e4b202a4d4805cb4f01f6fb11bea244223364b4850732fbb2060d607f0817 not found: ID does not exist" containerID="de0e4b202a4d4805cb4f01f6fb11bea244223364b4850732fbb2060d607f0817"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.524449 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de0e4b202a4d4805cb4f01f6fb11bea244223364b4850732fbb2060d607f0817"} err="failed to get container status \"de0e4b202a4d4805cb4f01f6fb11bea244223364b4850732fbb2060d607f0817\": rpc error: code = NotFound desc = could not find container \"de0e4b202a4d4805cb4f01f6fb11bea244223364b4850732fbb2060d607f0817\": container with ID starting with de0e4b202a4d4805cb4f01f6fb11bea244223364b4850732fbb2060d607f0817 not found: ID does not exist"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.524478 4717 scope.go:117] "RemoveContainer" containerID="fb79cd3c0d0e5c339af81f991233a3b9d44398f0b29cc5963cc0893867b41748"
Feb 21 21:49:52 crc kubenswrapper[4717]: E0221 21:49:52.525150 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb79cd3c0d0e5c339af81f991233a3b9d44398f0b29cc5963cc0893867b41748\": container with ID starting with fb79cd3c0d0e5c339af81f991233a3b9d44398f0b29cc5963cc0893867b41748 not found: ID does not exist" containerID="fb79cd3c0d0e5c339af81f991233a3b9d44398f0b29cc5963cc0893867b41748"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.525239 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb79cd3c0d0e5c339af81f991233a3b9d44398f0b29cc5963cc0893867b41748"} err="failed to get container status \"fb79cd3c0d0e5c339af81f991233a3b9d44398f0b29cc5963cc0893867b41748\": rpc error: code = NotFound desc = could not find container \"fb79cd3c0d0e5c339af81f991233a3b9d44398f0b29cc5963cc0893867b41748\": container with ID starting with fb79cd3c0d0e5c339af81f991233a3b9d44398f0b29cc5963cc0893867b41748 not found: ID does not exist"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.525297 4717 scope.go:117] "RemoveContainer" containerID="7bad77e341c5927ff1c2ac1170cc55f40c8075fce6dfb311be9e5a8feee4d068"
Feb 21 21:49:52 crc kubenswrapper[4717]: E0221 21:49:52.525692 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bad77e341c5927ff1c2ac1170cc55f40c8075fce6dfb311be9e5a8feee4d068\": container with ID starting with 7bad77e341c5927ff1c2ac1170cc55f40c8075fce6dfb311be9e5a8feee4d068 not found: ID does not exist" containerID="7bad77e341c5927ff1c2ac1170cc55f40c8075fce6dfb311be9e5a8feee4d068"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.525717 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bad77e341c5927ff1c2ac1170cc55f40c8075fce6dfb311be9e5a8feee4d068"} err="failed to get container status \"7bad77e341c5927ff1c2ac1170cc55f40c8075fce6dfb311be9e5a8feee4d068\": rpc error: code = NotFound desc = could not find container \"7bad77e341c5927ff1c2ac1170cc55f40c8075fce6dfb311be9e5a8feee4d068\": container with ID starting with 7bad77e341c5927ff1c2ac1170cc55f40c8075fce6dfb311be9e5a8feee4d068 not found: ID does not exist"
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.551556 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f963242-7453-42b2-83a3-6425582a2b9c-utilities\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.551630 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdmml\" (UniqueName: \"kubernetes.io/projected/7f963242-7453-42b2-83a3-6425582a2b9c-kube-api-access-rdmml\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.632671 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f963242-7453-42b2-83a3-6425582a2b9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f963242-7453-42b2-83a3-6425582a2b9c" (UID: "7f963242-7453-42b2-83a3-6425582a2b9c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.654241 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f963242-7453-42b2-83a3-6425582a2b9c-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.714101 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rrfqw"]
Feb 21 21:49:52 crc kubenswrapper[4717]: I0221 21:49:52.733280 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rrfqw"]
Feb 21 21:49:53 crc kubenswrapper[4717]: I0221 21:49:53.984724 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="280cdbf0-4686-47a5-a48e-55562853f0f6" path="/var/lib/kubelet/pods/280cdbf0-4686-47a5-a48e-55562853f0f6/volumes"
Feb 21 21:49:53 crc kubenswrapper[4717]: I0221 21:49:53.986084 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c73e02c-cb77-47e2-bf8a-1092ada428d5" path="/var/lib/kubelet/pods/5c73e02c-cb77-47e2-bf8a-1092ada428d5/volumes"
Feb 21 21:49:53 crc kubenswrapper[4717]: I0221 21:49:53.986939 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f963242-7453-42b2-83a3-6425582a2b9c" path="/var/lib/kubelet/pods/7f963242-7453-42b2-83a3-6425582a2b9c/volumes"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.741224 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"]
Feb 21 21:49:58 crc kubenswrapper[4717]: E0221 21:49:58.741974 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280cdbf0-4686-47a5-a48e-55562853f0f6" containerName="extract-utilities"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.741988 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="280cdbf0-4686-47a5-a48e-55562853f0f6" containerName="extract-utilities"
Feb 21 21:49:58 crc kubenswrapper[4717]: E0221 21:49:58.741996 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f963242-7453-42b2-83a3-6425582a2b9c" containerName="extract-utilities"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.742001 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f963242-7453-42b2-83a3-6425582a2b9c" containerName="extract-utilities"
Feb 21 21:49:58 crc kubenswrapper[4717]: E0221 21:49:58.742012 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f963242-7453-42b2-83a3-6425582a2b9c" containerName="extract-content"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.742018 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f963242-7453-42b2-83a3-6425582a2b9c" containerName="extract-content"
Feb 21 21:49:58 crc kubenswrapper[4717]: E0221 21:49:58.742028 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f963242-7453-42b2-83a3-6425582a2b9c" containerName="registry-server"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.742035 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f963242-7453-42b2-83a3-6425582a2b9c" containerName="registry-server"
Feb 21 21:49:58 crc kubenswrapper[4717]: E0221 21:49:58.742044 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280cdbf0-4686-47a5-a48e-55562853f0f6" containerName="extract-content"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.742049 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="280cdbf0-4686-47a5-a48e-55562853f0f6" containerName="extract-content"
Feb 21 21:49:58 crc kubenswrapper[4717]: E0221 21:49:58.742061 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1926a2c9-1123-4ab5-b414-d91e0d56c8f2" containerName="extract-utilities"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.742066 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1926a2c9-1123-4ab5-b414-d91e0d56c8f2" containerName="extract-utilities"
Feb 21 21:49:58 crc kubenswrapper[4717]: E0221 21:49:58.742074 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1926a2c9-1123-4ab5-b414-d91e0d56c8f2" containerName="extract-content"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.742080 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1926a2c9-1123-4ab5-b414-d91e0d56c8f2" containerName="extract-content"
Feb 21 21:49:58 crc kubenswrapper[4717]: E0221 21:49:58.742088 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280cdbf0-4686-47a5-a48e-55562853f0f6" containerName="registry-server"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.742093 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="280cdbf0-4686-47a5-a48e-55562853f0f6" containerName="registry-server"
Feb 21 21:49:58 crc kubenswrapper[4717]: E0221 21:49:58.742102 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c73e02c-cb77-47e2-bf8a-1092ada428d5" containerName="oauth-openshift"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.742108 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c73e02c-cb77-47e2-bf8a-1092ada428d5" containerName="oauth-openshift"
Feb 21 21:49:58 crc kubenswrapper[4717]: E0221 21:49:58.742116 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1926a2c9-1123-4ab5-b414-d91e0d56c8f2" containerName="registry-server"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.742121 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1926a2c9-1123-4ab5-b414-d91e0d56c8f2" containerName="registry-server"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.742215 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="280cdbf0-4686-47a5-a48e-55562853f0f6" containerName="registry-server"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.742237 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c73e02c-cb77-47e2-bf8a-1092ada428d5" containerName="oauth-openshift"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.742248 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f963242-7453-42b2-83a3-6425582a2b9c" containerName="registry-server"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.742257 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1926a2c9-1123-4ab5-b414-d91e0d56c8f2" containerName="registry-server"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.742886 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.752616 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.752954 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.753006 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.753033 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.753110 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.753208 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.753275 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.753434 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.753740 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.753963 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.754405 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.754431 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.765060 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.769446 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.773333 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.775355 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"]
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.847477 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a23dea1c-3787-4b5c-8096-3737561650ab-audit-policies\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.847543 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-session\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.847576 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-router-certs\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.847613 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.847913 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzzw7\" (UniqueName: \"kubernetes.io/projected/a23dea1c-3787-4b5c-8096-3737561650ab-kube-api-access-xzzw7\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.847989 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.848031 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a23dea1c-3787-4b5c-8096-3737561650ab-audit-dir\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.848140 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-service-ca\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.848456 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.848535 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-user-template-login\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.848565 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-user-template-error\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.848658 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.848756 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.848809 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.951109 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-service-ca\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.951159 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.951219 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-user-template-login\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.951238 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-user-template-error\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"
Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.952356 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for
volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.952691 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.952746 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.952769 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.952796 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a23dea1c-3787-4b5c-8096-3737561650ab-audit-policies\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " 
pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.952817 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-session\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.952837 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-router-certs\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.952879 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.952906 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzzw7\" (UniqueName: \"kubernetes.io/projected/a23dea1c-3787-4b5c-8096-3737561650ab-kube-api-access-xzzw7\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.952923 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.952940 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a23dea1c-3787-4b5c-8096-3737561650ab-audit-dir\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.953008 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a23dea1c-3787-4b5c-8096-3737561650ab-audit-dir\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.953385 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-service-ca\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.953596 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a23dea1c-3787-4b5c-8096-3737561650ab-audit-policies\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 
21:49:58.954644 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.958339 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.958440 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-router-certs\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.958559 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-user-template-error\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.959362 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.959523 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.959848 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-system-session\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.962688 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-user-template-login\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.968165 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a23dea1c-3787-4b5c-8096-3737561650ab-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:58 crc kubenswrapper[4717]: I0221 21:49:58.973932 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzzw7\" (UniqueName: \"kubernetes.io/projected/a23dea1c-3787-4b5c-8096-3737561650ab-kube-api-access-xzzw7\") pod \"oauth-openshift-d8dcdf7ff-fj7zg\" (UID: \"a23dea1c-3787-4b5c-8096-3737561650ab\") " pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:59 crc kubenswrapper[4717]: I0221 21:49:59.066195 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:49:59 crc kubenswrapper[4717]: I0221 21:49:59.544485 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg"] Feb 21 21:49:59 crc kubenswrapper[4717]: W0221 21:49:59.555253 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda23dea1c_3787_4b5c_8096_3737561650ab.slice/crio-a7a520974cfb3b64ee2795c86934ffe5a1359607d032847b6b3a169e1bfe20e8 WatchSource:0}: Error finding container a7a520974cfb3b64ee2795c86934ffe5a1359607d032847b6b3a169e1bfe20e8: Status 404 returned error can't find the container with id a7a520974cfb3b64ee2795c86934ffe5a1359607d032847b6b3a169e1bfe20e8 Feb 21 21:50:00 crc kubenswrapper[4717]: I0221 21:50:00.464008 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" event={"ID":"a23dea1c-3787-4b5c-8096-3737561650ab","Type":"ContainerStarted","Data":"75e3f7cd20277b75e88a8084d67fd1d62a566dc35d328b3c73d11c344a1aa4b6"} Feb 21 21:50:00 crc kubenswrapper[4717]: I0221 21:50:00.464062 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" event={"ID":"a23dea1c-3787-4b5c-8096-3737561650ab","Type":"ContainerStarted","Data":"a7a520974cfb3b64ee2795c86934ffe5a1359607d032847b6b3a169e1bfe20e8"} Feb 21 21:50:00 crc kubenswrapper[4717]: I0221 21:50:00.464351 
4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:50:00 crc kubenswrapper[4717]: I0221 21:50:00.471040 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" Feb 21 21:50:00 crc kubenswrapper[4717]: I0221 21:50:00.490428 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-d8dcdf7ff-fj7zg" podStartSLOduration=34.490414432 podStartE2EDuration="34.490414432s" podCreationTimestamp="2026-02-21 21:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:50:00.488184582 +0000 UTC m=+215.269718244" watchObservedRunningTime="2026-02-21 21:50:00.490414432 +0000 UTC m=+215.271948064" Feb 21 21:50:09 crc kubenswrapper[4717]: I0221 21:50:09.063321 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 21:50:09 crc kubenswrapper[4717]: I0221 21:50:09.063817 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 21:50:09 crc kubenswrapper[4717]: I0221 21:50:09.063919 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:50:09 crc kubenswrapper[4717]: I0221 21:50:09.064674 4717 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d4084ad77fbd48700228a07cf9b368bc2a8fb4f3b65222c7b31d74958eb4425b"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 21:50:09 crc kubenswrapper[4717]: I0221 21:50:09.064743 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://d4084ad77fbd48700228a07cf9b368bc2a8fb4f3b65222c7b31d74958eb4425b" gracePeriod=600 Feb 21 21:50:09 crc kubenswrapper[4717]: I0221 21:50:09.521816 4717 generic.go:334] "Generic (PLEG): container finished" podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerID="d4084ad77fbd48700228a07cf9b368bc2a8fb4f3b65222c7b31d74958eb4425b" exitCode=0 Feb 21 21:50:09 crc kubenswrapper[4717]: I0221 21:50:09.521899 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"d4084ad77fbd48700228a07cf9b368bc2a8fb4f3b65222c7b31d74958eb4425b"} Feb 21 21:50:09 crc kubenswrapper[4717]: I0221 21:50:09.522301 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"f1592279668470423b045cbb5b5e5ff0c27879fab6c9b2573e402c21a013af59"} Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.703727 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d7bn5"] Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.705803 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d7bn5" 
podUID="a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" containerName="registry-server" containerID="cri-o://fb6d98df8fcd184f9195604c5ac913bae3afdaad8b4c2090c130c22460471f84" gracePeriod=30 Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.739512 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9kmrb"] Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.746656 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8q2r4"] Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.750482 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9kmrb" podUID="80125217-20f6-4337-8be2-8874b40aa10e" containerName="registry-server" containerID="cri-o://bbc7bd6ca0b665644cf04c0493ead4ea21ff5daac216e12cea64f2e017217f1b" gracePeriod=30 Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.754577 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" podUID="60d25c82-47d6-4706-8235-70fd592a984d" containerName="marketplace-operator" containerID="cri-o://876c324a12fc91ef3801bbaa01debf7a01dcdb1665fde5c01429e6cd393f2d20" gracePeriod=30 Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.788243 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6m7gj"] Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.788477 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6m7gj" podUID="914ea582-116d-4c4a-9d8d-34fda8fb5323" containerName="registry-server" containerID="cri-o://422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637" gracePeriod=30 Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.795615 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-v7tr6"] Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.795962 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v7tr6" podUID="ea0c5c67-a77c-463b-8339-a73a7a9605e1" containerName="registry-server" containerID="cri-o://0ab6a207a2621190198631ee463a84465bbea80fe3ef38bff146ea14403f77c6" gracePeriod=30 Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.809387 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kd64g"] Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.810322 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kd64g" Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.830796 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kd64g"] Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.917157 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kd64g\" (UID: \"5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d\") " pod="openshift-marketplace/marketplace-operator-79b997595-kd64g" Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.917201 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kd64g\" (UID: \"5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d\") " pod="openshift-marketplace/marketplace-operator-79b997595-kd64g" Feb 21 21:50:16 crc kubenswrapper[4717]: I0221 21:50:16.917323 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv468\" (UniqueName: \"kubernetes.io/projected/5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d-kube-api-access-bv468\") pod \"marketplace-operator-79b997595-kd64g\" (UID: \"5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d\") " pod="openshift-marketplace/marketplace-operator-79b997595-kd64g" Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.003276 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637 is running failed: container process not found" containerID="422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637" cmd=["grpc_health_probe","-addr=:50051"] Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.003664 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637 is running failed: container process not found" containerID="422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637" cmd=["grpc_health_probe","-addr=:50051"] Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.004085 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637 is running failed: container process not found" containerID="422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637" cmd=["grpc_health_probe","-addr=:50051"] Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.004116 4717 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637 is running failed: container 
process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-6m7gj" podUID="914ea582-116d-4c4a-9d8d-34fda8fb5323" containerName="registry-server" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.018411 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kd64g\" (UID: \"5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d\") " pod="openshift-marketplace/marketplace-operator-79b997595-kd64g" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.018463 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kd64g\" (UID: \"5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d\") " pod="openshift-marketplace/marketplace-operator-79b997595-kd64g" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.018502 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv468\" (UniqueName: \"kubernetes.io/projected/5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d-kube-api-access-bv468\") pod \"marketplace-operator-79b997595-kd64g\" (UID: \"5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d\") " pod="openshift-marketplace/marketplace-operator-79b997595-kd64g" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.024537 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kd64g\" (UID: \"5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d\") " pod="openshift-marketplace/marketplace-operator-79b997595-kd64g" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.024837 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kd64g\" (UID: \"5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d\") " pod="openshift-marketplace/marketplace-operator-79b997595-kd64g" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.040591 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv468\" (UniqueName: \"kubernetes.io/projected/5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d-kube-api-access-bv468\") pod \"marketplace-operator-79b997595-kd64g\" (UID: \"5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d\") " pod="openshift-marketplace/marketplace-operator-79b997595-kd64g" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.117821 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kd64g" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.203288 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d7bn5" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.220907 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwqx9\" (UniqueName: \"kubernetes.io/projected/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-kube-api-access-nwqx9\") pod \"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb\" (UID: \"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb\") " Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.220948 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-utilities\") pod \"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb\" (UID: \"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb\") " Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.221025 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-catalog-content\") pod \"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb\" (UID: \"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb\") " Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.226336 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-utilities" (OuterVolumeSpecName: "utilities") pod "a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" (UID: "a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.232107 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-kube-api-access-nwqx9" (OuterVolumeSpecName: "kube-api-access-nwqx9") pod "a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" (UID: "a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb"). InnerVolumeSpecName "kube-api-access-nwqx9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.240408 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.257962 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9kmrb" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.296680 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6m7gj" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.308829 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" (UID: "a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.318371 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v7tr6" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.321831 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80125217-20f6-4337-8be2-8874b40aa10e-catalog-content\") pod \"80125217-20f6-4337-8be2-8874b40aa10e\" (UID: \"80125217-20f6-4337-8be2-8874b40aa10e\") " Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.321926 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dh5k\" (UniqueName: \"kubernetes.io/projected/80125217-20f6-4337-8be2-8874b40aa10e-kube-api-access-5dh5k\") pod \"80125217-20f6-4337-8be2-8874b40aa10e\" (UID: \"80125217-20f6-4337-8be2-8874b40aa10e\") " Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.321953 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80125217-20f6-4337-8be2-8874b40aa10e-utilities\") pod \"80125217-20f6-4337-8be2-8874b40aa10e\" (UID: \"80125217-20f6-4337-8be2-8874b40aa10e\") " Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.321971 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/914ea582-116d-4c4a-9d8d-34fda8fb5323-catalog-content\") pod \"914ea582-116d-4c4a-9d8d-34fda8fb5323\" (UID: \"914ea582-116d-4c4a-9d8d-34fda8fb5323\") " Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.322014 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjwwh\" (UniqueName: \"kubernetes.io/projected/914ea582-116d-4c4a-9d8d-34fda8fb5323-kube-api-access-gjwwh\") pod \"914ea582-116d-4c4a-9d8d-34fda8fb5323\" (UID: \"914ea582-116d-4c4a-9d8d-34fda8fb5323\") " Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.322052 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/914ea582-116d-4c4a-9d8d-34fda8fb5323-utilities\") pod \"914ea582-116d-4c4a-9d8d-34fda8fb5323\" (UID: \"914ea582-116d-4c4a-9d8d-34fda8fb5323\") " Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.322072 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6h28\" (UniqueName: \"kubernetes.io/projected/60d25c82-47d6-4706-8235-70fd592a984d-kube-api-access-p6h28\") pod \"60d25c82-47d6-4706-8235-70fd592a984d\" (UID: \"60d25c82-47d6-4706-8235-70fd592a984d\") " Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.322100 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/60d25c82-47d6-4706-8235-70fd592a984d-marketplace-operator-metrics\") pod \"60d25c82-47d6-4706-8235-70fd592a984d\" (UID: \"60d25c82-47d6-4706-8235-70fd592a984d\") " Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.322149 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60d25c82-47d6-4706-8235-70fd592a984d-marketplace-trusted-ca\") pod \"60d25c82-47d6-4706-8235-70fd592a984d\" (UID: \"60d25c82-47d6-4706-8235-70fd592a984d\") " Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.322347 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwqx9\" (UniqueName: \"kubernetes.io/projected/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-kube-api-access-nwqx9\") on node \"crc\" DevicePath \"\"" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.322364 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.322373 4717 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.323024 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60d25c82-47d6-4706-8235-70fd592a984d-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "60d25c82-47d6-4706-8235-70fd592a984d" (UID: "60d25c82-47d6-4706-8235-70fd592a984d"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.324997 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/914ea582-116d-4c4a-9d8d-34fda8fb5323-utilities" (OuterVolumeSpecName: "utilities") pod "914ea582-116d-4c4a-9d8d-34fda8fb5323" (UID: "914ea582-116d-4c4a-9d8d-34fda8fb5323"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.327375 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80125217-20f6-4337-8be2-8874b40aa10e-utilities" (OuterVolumeSpecName: "utilities") pod "80125217-20f6-4337-8be2-8874b40aa10e" (UID: "80125217-20f6-4337-8be2-8874b40aa10e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.327874 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/914ea582-116d-4c4a-9d8d-34fda8fb5323-kube-api-access-gjwwh" (OuterVolumeSpecName: "kube-api-access-gjwwh") pod "914ea582-116d-4c4a-9d8d-34fda8fb5323" (UID: "914ea582-116d-4c4a-9d8d-34fda8fb5323"). InnerVolumeSpecName "kube-api-access-gjwwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.328332 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d25c82-47d6-4706-8235-70fd592a984d-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "60d25c82-47d6-4706-8235-70fd592a984d" (UID: "60d25c82-47d6-4706-8235-70fd592a984d"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.329542 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80125217-20f6-4337-8be2-8874b40aa10e-kube-api-access-5dh5k" (OuterVolumeSpecName: "kube-api-access-5dh5k") pod "80125217-20f6-4337-8be2-8874b40aa10e" (UID: "80125217-20f6-4337-8be2-8874b40aa10e"). InnerVolumeSpecName "kube-api-access-5dh5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.331809 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60d25c82-47d6-4706-8235-70fd592a984d-kube-api-access-p6h28" (OuterVolumeSpecName: "kube-api-access-p6h28") pod "60d25c82-47d6-4706-8235-70fd592a984d" (UID: "60d25c82-47d6-4706-8235-70fd592a984d"). InnerVolumeSpecName "kube-api-access-p6h28". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.386063 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/914ea582-116d-4c4a-9d8d-34fda8fb5323-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "914ea582-116d-4c4a-9d8d-34fda8fb5323" (UID: "914ea582-116d-4c4a-9d8d-34fda8fb5323"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.393936 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kd64g"] Feb 21 21:50:17 crc kubenswrapper[4717]: W0221 21:50:17.402048 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dd8c724_ea15_4c93_b15b_cfe5b39d9c1d.slice/crio-b7bebb689e6b8eb659bb4995a835e4018d6f8ccbdc41039ff8e45696ac2b7bd9 WatchSource:0}: Error finding container b7bebb689e6b8eb659bb4995a835e4018d6f8ccbdc41039ff8e45696ac2b7bd9: Status 404 returned error can't find the container with id b7bebb689e6b8eb659bb4995a835e4018d6f8ccbdc41039ff8e45696ac2b7bd9 Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.416241 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80125217-20f6-4337-8be2-8874b40aa10e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80125217-20f6-4337-8be2-8874b40aa10e" (UID: "80125217-20f6-4337-8be2-8874b40aa10e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.423214 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0c5c67-a77c-463b-8339-a73a7a9605e1-utilities\") pod \"ea0c5c67-a77c-463b-8339-a73a7a9605e1\" (UID: \"ea0c5c67-a77c-463b-8339-a73a7a9605e1\") " Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.423285 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxvwr\" (UniqueName: \"kubernetes.io/projected/ea0c5c67-a77c-463b-8339-a73a7a9605e1-kube-api-access-lxvwr\") pod \"ea0c5c67-a77c-463b-8339-a73a7a9605e1\" (UID: \"ea0c5c67-a77c-463b-8339-a73a7a9605e1\") " Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.423333 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0c5c67-a77c-463b-8339-a73a7a9605e1-catalog-content\") pod \"ea0c5c67-a77c-463b-8339-a73a7a9605e1\" (UID: \"ea0c5c67-a77c-463b-8339-a73a7a9605e1\") " Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.423591 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80125217-20f6-4337-8be2-8874b40aa10e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.423638 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dh5k\" (UniqueName: \"kubernetes.io/projected/80125217-20f6-4337-8be2-8874b40aa10e-kube-api-access-5dh5k\") on node \"crc\" DevicePath \"\"" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.423652 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80125217-20f6-4337-8be2-8874b40aa10e-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.423661 
4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/914ea582-116d-4c4a-9d8d-34fda8fb5323-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.423670 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjwwh\" (UniqueName: \"kubernetes.io/projected/914ea582-116d-4c4a-9d8d-34fda8fb5323-kube-api-access-gjwwh\") on node \"crc\" DevicePath \"\"" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.423679 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/914ea582-116d-4c4a-9d8d-34fda8fb5323-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.423720 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6h28\" (UniqueName: \"kubernetes.io/projected/60d25c82-47d6-4706-8235-70fd592a984d-kube-api-access-p6h28\") on node \"crc\" DevicePath \"\"" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.423732 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/60d25c82-47d6-4706-8235-70fd592a984d-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.423744 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60d25c82-47d6-4706-8235-70fd592a984d-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.424029 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea0c5c67-a77c-463b-8339-a73a7a9605e1-utilities" (OuterVolumeSpecName: "utilities") pod "ea0c5c67-a77c-463b-8339-a73a7a9605e1" (UID: "ea0c5c67-a77c-463b-8339-a73a7a9605e1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.440219 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea0c5c67-a77c-463b-8339-a73a7a9605e1-kube-api-access-lxvwr" (OuterVolumeSpecName: "kube-api-access-lxvwr") pod "ea0c5c67-a77c-463b-8339-a73a7a9605e1" (UID: "ea0c5c67-a77c-463b-8339-a73a7a9605e1"). InnerVolumeSpecName "kube-api-access-lxvwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.525426 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0c5c67-a77c-463b-8339-a73a7a9605e1-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.525818 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxvwr\" (UniqueName: \"kubernetes.io/projected/ea0c5c67-a77c-463b-8339-a73a7a9605e1-kube-api-access-lxvwr\") on node \"crc\" DevicePath \"\"" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.549351 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea0c5c67-a77c-463b-8339-a73a7a9605e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea0c5c67-a77c-463b-8339-a73a7a9605e1" (UID: "ea0c5c67-a77c-463b-8339-a73a7a9605e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.560279 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kd64g" event={"ID":"5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d","Type":"ContainerStarted","Data":"b7bebb689e6b8eb659bb4995a835e4018d6f8ccbdc41039ff8e45696ac2b7bd9"} Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.561480 4717 generic.go:334] "Generic (PLEG): container finished" podID="60d25c82-47d6-4706-8235-70fd592a984d" containerID="876c324a12fc91ef3801bbaa01debf7a01dcdb1665fde5c01429e6cd393f2d20" exitCode=0 Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.561534 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.561537 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" event={"ID":"60d25c82-47d6-4706-8235-70fd592a984d","Type":"ContainerDied","Data":"876c324a12fc91ef3801bbaa01debf7a01dcdb1665fde5c01429e6cd393f2d20"} Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.561579 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8q2r4" event={"ID":"60d25c82-47d6-4706-8235-70fd592a984d","Type":"ContainerDied","Data":"cdff9510ac1d6bce743f1b01fe034ceb14fed18f7fba25c9e42e698a3b1c3fe6"} Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.561593 4717 scope.go:117] "RemoveContainer" containerID="876c324a12fc91ef3801bbaa01debf7a01dcdb1665fde5c01429e6cd393f2d20" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.564966 4717 generic.go:334] "Generic (PLEG): container finished" podID="80125217-20f6-4337-8be2-8874b40aa10e" containerID="bbc7bd6ca0b665644cf04c0493ead4ea21ff5daac216e12cea64f2e017217f1b" exitCode=0 Feb 21 21:50:17 crc kubenswrapper[4717]: 
I0221 21:50:17.565013 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kmrb" event={"ID":"80125217-20f6-4337-8be2-8874b40aa10e","Type":"ContainerDied","Data":"bbc7bd6ca0b665644cf04c0493ead4ea21ff5daac216e12cea64f2e017217f1b"} Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.565031 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kmrb" event={"ID":"80125217-20f6-4337-8be2-8874b40aa10e","Type":"ContainerDied","Data":"f5ba6839a4c9898a0a6a2be1ce783e056febf09b8c3bf751dee6441d4c844aea"} Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.565094 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9kmrb" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.572740 4717 generic.go:334] "Generic (PLEG): container finished" podID="a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" containerID="fb6d98df8fcd184f9195604c5ac913bae3afdaad8b4c2090c130c22460471f84" exitCode=0 Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.572786 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7bn5" event={"ID":"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb","Type":"ContainerDied","Data":"fb6d98df8fcd184f9195604c5ac913bae3afdaad8b4c2090c130c22460471f84"} Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.572802 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7bn5" event={"ID":"a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb","Type":"ContainerDied","Data":"083272ec1b004be045eb016dee7186ea2ea7fbd58df2af6b09d087414cedccee"} Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.572887 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d7bn5" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.575371 4717 generic.go:334] "Generic (PLEG): container finished" podID="ea0c5c67-a77c-463b-8339-a73a7a9605e1" containerID="0ab6a207a2621190198631ee463a84465bbea80fe3ef38bff146ea14403f77c6" exitCode=0 Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.575408 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7tr6" event={"ID":"ea0c5c67-a77c-463b-8339-a73a7a9605e1","Type":"ContainerDied","Data":"0ab6a207a2621190198631ee463a84465bbea80fe3ef38bff146ea14403f77c6"} Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.575463 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7tr6" event={"ID":"ea0c5c67-a77c-463b-8339-a73a7a9605e1","Type":"ContainerDied","Data":"04add27162731e928224a60d16d864824f02733524042a567bb0ab9a56cbdcec"} Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.575510 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v7tr6" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.579006 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6m7gj" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.579146 4717 generic.go:334] "Generic (PLEG): container finished" podID="914ea582-116d-4c4a-9d8d-34fda8fb5323" containerID="422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637" exitCode=0 Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.579391 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6m7gj" event={"ID":"914ea582-116d-4c4a-9d8d-34fda8fb5323","Type":"ContainerDied","Data":"422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637"} Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.579679 4717 scope.go:117] "RemoveContainer" containerID="876c324a12fc91ef3801bbaa01debf7a01dcdb1665fde5c01429e6cd393f2d20" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.579919 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6m7gj" event={"ID":"914ea582-116d-4c4a-9d8d-34fda8fb5323","Type":"ContainerDied","Data":"752aae47e11cc703baa30e6fc72dfa8a5e246bfca22d8b789f1c5e86fdf65887"} Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.580137 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"876c324a12fc91ef3801bbaa01debf7a01dcdb1665fde5c01429e6cd393f2d20\": container with ID starting with 876c324a12fc91ef3801bbaa01debf7a01dcdb1665fde5c01429e6cd393f2d20 not found: ID does not exist" containerID="876c324a12fc91ef3801bbaa01debf7a01dcdb1665fde5c01429e6cd393f2d20" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.580166 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"876c324a12fc91ef3801bbaa01debf7a01dcdb1665fde5c01429e6cd393f2d20"} err="failed to get container status \"876c324a12fc91ef3801bbaa01debf7a01dcdb1665fde5c01429e6cd393f2d20\": rpc error: code = NotFound desc = could not 
find container \"876c324a12fc91ef3801bbaa01debf7a01dcdb1665fde5c01429e6cd393f2d20\": container with ID starting with 876c324a12fc91ef3801bbaa01debf7a01dcdb1665fde5c01429e6cd393f2d20 not found: ID does not exist" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.580186 4717 scope.go:117] "RemoveContainer" containerID="bbc7bd6ca0b665644cf04c0493ead4ea21ff5daac216e12cea64f2e017217f1b" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.605424 4717 scope.go:117] "RemoveContainer" containerID="52d802740022168e1a06e03f669be100ced4e5b8b1c7c439dc6f0b5721857bd6" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.613070 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9kmrb"] Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.616664 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9kmrb"] Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.624040 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d7bn5"] Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.626378 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d7bn5"] Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.626466 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0c5c67-a77c-463b-8339-a73a7a9605e1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.632316 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8q2r4"] Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.639700 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8q2r4"] Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.642753 4717 scope.go:117] "RemoveContainer" 
containerID="fc46b171a05c4b63466c71629b014ec86f17639e3337512422e70a105342cd64" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.642885 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6m7gj"] Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.648363 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6m7gj"] Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.655048 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v7tr6"] Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.659773 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v7tr6"] Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.663595 4717 scope.go:117] "RemoveContainer" containerID="bbc7bd6ca0b665644cf04c0493ead4ea21ff5daac216e12cea64f2e017217f1b" Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.664081 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbc7bd6ca0b665644cf04c0493ead4ea21ff5daac216e12cea64f2e017217f1b\": container with ID starting with bbc7bd6ca0b665644cf04c0493ead4ea21ff5daac216e12cea64f2e017217f1b not found: ID does not exist" containerID="bbc7bd6ca0b665644cf04c0493ead4ea21ff5daac216e12cea64f2e017217f1b" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.664118 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc7bd6ca0b665644cf04c0493ead4ea21ff5daac216e12cea64f2e017217f1b"} err="failed to get container status \"bbc7bd6ca0b665644cf04c0493ead4ea21ff5daac216e12cea64f2e017217f1b\": rpc error: code = NotFound desc = could not find container \"bbc7bd6ca0b665644cf04c0493ead4ea21ff5daac216e12cea64f2e017217f1b\": container with ID starting with bbc7bd6ca0b665644cf04c0493ead4ea21ff5daac216e12cea64f2e017217f1b not found: ID does not exist" 
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.664143 4717 scope.go:117] "RemoveContainer" containerID="52d802740022168e1a06e03f669be100ced4e5b8b1c7c439dc6f0b5721857bd6" Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.664442 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52d802740022168e1a06e03f669be100ced4e5b8b1c7c439dc6f0b5721857bd6\": container with ID starting with 52d802740022168e1a06e03f669be100ced4e5b8b1c7c439dc6f0b5721857bd6 not found: ID does not exist" containerID="52d802740022168e1a06e03f669be100ced4e5b8b1c7c439dc6f0b5721857bd6" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.664467 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52d802740022168e1a06e03f669be100ced4e5b8b1c7c439dc6f0b5721857bd6"} err="failed to get container status \"52d802740022168e1a06e03f669be100ced4e5b8b1c7c439dc6f0b5721857bd6\": rpc error: code = NotFound desc = could not find container \"52d802740022168e1a06e03f669be100ced4e5b8b1c7c439dc6f0b5721857bd6\": container with ID starting with 52d802740022168e1a06e03f669be100ced4e5b8b1c7c439dc6f0b5721857bd6 not found: ID does not exist" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.664483 4717 scope.go:117] "RemoveContainer" containerID="fc46b171a05c4b63466c71629b014ec86f17639e3337512422e70a105342cd64" Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.664728 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc46b171a05c4b63466c71629b014ec86f17639e3337512422e70a105342cd64\": container with ID starting with fc46b171a05c4b63466c71629b014ec86f17639e3337512422e70a105342cd64 not found: ID does not exist" containerID="fc46b171a05c4b63466c71629b014ec86f17639e3337512422e70a105342cd64" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.664756 4717 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"fc46b171a05c4b63466c71629b014ec86f17639e3337512422e70a105342cd64"} err="failed to get container status \"fc46b171a05c4b63466c71629b014ec86f17639e3337512422e70a105342cd64\": rpc error: code = NotFound desc = could not find container \"fc46b171a05c4b63466c71629b014ec86f17639e3337512422e70a105342cd64\": container with ID starting with fc46b171a05c4b63466c71629b014ec86f17639e3337512422e70a105342cd64 not found: ID does not exist" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.664781 4717 scope.go:117] "RemoveContainer" containerID="fb6d98df8fcd184f9195604c5ac913bae3afdaad8b4c2090c130c22460471f84" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.680616 4717 scope.go:117] "RemoveContainer" containerID="22c24f097aab5059c6b45b8ea818853073ab35f87b414f27305926ddeba387d2" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.694502 4717 scope.go:117] "RemoveContainer" containerID="12fe8bcf7a8c80ed6bfadc882a7fb3d8c908a71e34b11a891e8651aa01d5e3e4" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.708607 4717 scope.go:117] "RemoveContainer" containerID="fb6d98df8fcd184f9195604c5ac913bae3afdaad8b4c2090c130c22460471f84" Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.709090 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb6d98df8fcd184f9195604c5ac913bae3afdaad8b4c2090c130c22460471f84\": container with ID starting with fb6d98df8fcd184f9195604c5ac913bae3afdaad8b4c2090c130c22460471f84 not found: ID does not exist" containerID="fb6d98df8fcd184f9195604c5ac913bae3afdaad8b4c2090c130c22460471f84" Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.709131 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb6d98df8fcd184f9195604c5ac913bae3afdaad8b4c2090c130c22460471f84"} err="failed to get container status \"fb6d98df8fcd184f9195604c5ac913bae3afdaad8b4c2090c130c22460471f84\": 
rpc error: code = NotFound desc = could not find container \"fb6d98df8fcd184f9195604c5ac913bae3afdaad8b4c2090c130c22460471f84\": container with ID starting with fb6d98df8fcd184f9195604c5ac913bae3afdaad8b4c2090c130c22460471f84 not found: ID does not exist"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.709167 4717 scope.go:117] "RemoveContainer" containerID="22c24f097aab5059c6b45b8ea818853073ab35f87b414f27305926ddeba387d2"
Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.709471 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c24f097aab5059c6b45b8ea818853073ab35f87b414f27305926ddeba387d2\": container with ID starting with 22c24f097aab5059c6b45b8ea818853073ab35f87b414f27305926ddeba387d2 not found: ID does not exist" containerID="22c24f097aab5059c6b45b8ea818853073ab35f87b414f27305926ddeba387d2"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.709512 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c24f097aab5059c6b45b8ea818853073ab35f87b414f27305926ddeba387d2"} err="failed to get container status \"22c24f097aab5059c6b45b8ea818853073ab35f87b414f27305926ddeba387d2\": rpc error: code = NotFound desc = could not find container \"22c24f097aab5059c6b45b8ea818853073ab35f87b414f27305926ddeba387d2\": container with ID starting with 22c24f097aab5059c6b45b8ea818853073ab35f87b414f27305926ddeba387d2 not found: ID does not exist"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.709540 4717 scope.go:117] "RemoveContainer" containerID="12fe8bcf7a8c80ed6bfadc882a7fb3d8c908a71e34b11a891e8651aa01d5e3e4"
Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.709844 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12fe8bcf7a8c80ed6bfadc882a7fb3d8c908a71e34b11a891e8651aa01d5e3e4\": container with ID starting with 12fe8bcf7a8c80ed6bfadc882a7fb3d8c908a71e34b11a891e8651aa01d5e3e4 not found: ID does not exist" containerID="12fe8bcf7a8c80ed6bfadc882a7fb3d8c908a71e34b11a891e8651aa01d5e3e4"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.709889 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12fe8bcf7a8c80ed6bfadc882a7fb3d8c908a71e34b11a891e8651aa01d5e3e4"} err="failed to get container status \"12fe8bcf7a8c80ed6bfadc882a7fb3d8c908a71e34b11a891e8651aa01d5e3e4\": rpc error: code = NotFound desc = could not find container \"12fe8bcf7a8c80ed6bfadc882a7fb3d8c908a71e34b11a891e8651aa01d5e3e4\": container with ID starting with 12fe8bcf7a8c80ed6bfadc882a7fb3d8c908a71e34b11a891e8651aa01d5e3e4 not found: ID does not exist"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.709908 4717 scope.go:117] "RemoveContainer" containerID="0ab6a207a2621190198631ee463a84465bbea80fe3ef38bff146ea14403f77c6"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.723627 4717 scope.go:117] "RemoveContainer" containerID="350483db840ae6036239485e8c1c74510281418feb88611e0e06f35f439ea10d"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.760214 4717 scope.go:117] "RemoveContainer" containerID="dce0b5f83153604bef5832ccb022565eb7517b24ebba54630973c986e5725123"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.787325 4717 scope.go:117] "RemoveContainer" containerID="0ab6a207a2621190198631ee463a84465bbea80fe3ef38bff146ea14403f77c6"
Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.788105 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ab6a207a2621190198631ee463a84465bbea80fe3ef38bff146ea14403f77c6\": container with ID starting with 0ab6a207a2621190198631ee463a84465bbea80fe3ef38bff146ea14403f77c6 not found: ID does not exist" containerID="0ab6a207a2621190198631ee463a84465bbea80fe3ef38bff146ea14403f77c6"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.788151 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab6a207a2621190198631ee463a84465bbea80fe3ef38bff146ea14403f77c6"} err="failed to get container status \"0ab6a207a2621190198631ee463a84465bbea80fe3ef38bff146ea14403f77c6\": rpc error: code = NotFound desc = could not find container \"0ab6a207a2621190198631ee463a84465bbea80fe3ef38bff146ea14403f77c6\": container with ID starting with 0ab6a207a2621190198631ee463a84465bbea80fe3ef38bff146ea14403f77c6 not found: ID does not exist"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.788183 4717 scope.go:117] "RemoveContainer" containerID="350483db840ae6036239485e8c1c74510281418feb88611e0e06f35f439ea10d"
Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.788702 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"350483db840ae6036239485e8c1c74510281418feb88611e0e06f35f439ea10d\": container with ID starting with 350483db840ae6036239485e8c1c74510281418feb88611e0e06f35f439ea10d not found: ID does not exist" containerID="350483db840ae6036239485e8c1c74510281418feb88611e0e06f35f439ea10d"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.788823 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"350483db840ae6036239485e8c1c74510281418feb88611e0e06f35f439ea10d"} err="failed to get container status \"350483db840ae6036239485e8c1c74510281418feb88611e0e06f35f439ea10d\": rpc error: code = NotFound desc = could not find container \"350483db840ae6036239485e8c1c74510281418feb88611e0e06f35f439ea10d\": container with ID starting with 350483db840ae6036239485e8c1c74510281418feb88611e0e06f35f439ea10d not found: ID does not exist"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.788961 4717 scope.go:117] "RemoveContainer" containerID="dce0b5f83153604bef5832ccb022565eb7517b24ebba54630973c986e5725123"
Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.789443 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dce0b5f83153604bef5832ccb022565eb7517b24ebba54630973c986e5725123\": container with ID starting with dce0b5f83153604bef5832ccb022565eb7517b24ebba54630973c986e5725123 not found: ID does not exist" containerID="dce0b5f83153604bef5832ccb022565eb7517b24ebba54630973c986e5725123"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.789487 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce0b5f83153604bef5832ccb022565eb7517b24ebba54630973c986e5725123"} err="failed to get container status \"dce0b5f83153604bef5832ccb022565eb7517b24ebba54630973c986e5725123\": rpc error: code = NotFound desc = could not find container \"dce0b5f83153604bef5832ccb022565eb7517b24ebba54630973c986e5725123\": container with ID starting with dce0b5f83153604bef5832ccb022565eb7517b24ebba54630973c986e5725123 not found: ID does not exist"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.789514 4717 scope.go:117] "RemoveContainer" containerID="422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.806405 4717 scope.go:117] "RemoveContainer" containerID="9af7938113f33996040e3d4a5142246e38c533b2998e3d9b073b15ab2ed376fc"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.818425 4717 scope.go:117] "RemoveContainer" containerID="44fa110c2279e50108c55df2b05c36c82825d4b791a6c36ea3503d948ac3ddbf"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.831330 4717 scope.go:117] "RemoveContainer" containerID="422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637"
Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.831821 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637\": container with ID starting with 422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637 not found: ID does not exist" containerID="422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.831880 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637"} err="failed to get container status \"422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637\": rpc error: code = NotFound desc = could not find container \"422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637\": container with ID starting with 422f2ff7ad999c61b4c696854a6c84ff4a278472f1c6ceba8da93c7c820b4637 not found: ID does not exist"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.831908 4717 scope.go:117] "RemoveContainer" containerID="9af7938113f33996040e3d4a5142246e38c533b2998e3d9b073b15ab2ed376fc"
Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.832576 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9af7938113f33996040e3d4a5142246e38c533b2998e3d9b073b15ab2ed376fc\": container with ID starting with 9af7938113f33996040e3d4a5142246e38c533b2998e3d9b073b15ab2ed376fc not found: ID does not exist" containerID="9af7938113f33996040e3d4a5142246e38c533b2998e3d9b073b15ab2ed376fc"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.832633 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af7938113f33996040e3d4a5142246e38c533b2998e3d9b073b15ab2ed376fc"} err="failed to get container status \"9af7938113f33996040e3d4a5142246e38c533b2998e3d9b073b15ab2ed376fc\": rpc error: code = NotFound desc = could not find container \"9af7938113f33996040e3d4a5142246e38c533b2998e3d9b073b15ab2ed376fc\": container with ID starting with 9af7938113f33996040e3d4a5142246e38c533b2998e3d9b073b15ab2ed376fc not found: ID does not exist"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.832667 4717 scope.go:117] "RemoveContainer" containerID="44fa110c2279e50108c55df2b05c36c82825d4b791a6c36ea3503d948ac3ddbf"
Feb 21 21:50:17 crc kubenswrapper[4717]: E0221 21:50:17.833617 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44fa110c2279e50108c55df2b05c36c82825d4b791a6c36ea3503d948ac3ddbf\": container with ID starting with 44fa110c2279e50108c55df2b05c36c82825d4b791a6c36ea3503d948ac3ddbf not found: ID does not exist" containerID="44fa110c2279e50108c55df2b05c36c82825d4b791a6c36ea3503d948ac3ddbf"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.833650 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44fa110c2279e50108c55df2b05c36c82825d4b791a6c36ea3503d948ac3ddbf"} err="failed to get container status \"44fa110c2279e50108c55df2b05c36c82825d4b791a6c36ea3503d948ac3ddbf\": rpc error: code = NotFound desc = could not find container \"44fa110c2279e50108c55df2b05c36c82825d4b791a6c36ea3503d948ac3ddbf\": container with ID starting with 44fa110c2279e50108c55df2b05c36c82825d4b791a6c36ea3503d948ac3ddbf not found: ID does not exist"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.983563 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60d25c82-47d6-4706-8235-70fd592a984d" path="/var/lib/kubelet/pods/60d25c82-47d6-4706-8235-70fd592a984d/volumes"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.984555 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80125217-20f6-4337-8be2-8874b40aa10e" path="/var/lib/kubelet/pods/80125217-20f6-4337-8be2-8874b40aa10e/volumes"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.985153 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="914ea582-116d-4c4a-9d8d-34fda8fb5323" path="/var/lib/kubelet/pods/914ea582-116d-4c4a-9d8d-34fda8fb5323/volumes"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.988078 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" path="/var/lib/kubelet/pods/a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb/volumes"
Feb 21 21:50:17 crc kubenswrapper[4717]: I0221 21:50:17.995901 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea0c5c67-a77c-463b-8339-a73a7a9605e1" path="/var/lib/kubelet/pods/ea0c5c67-a77c-463b-8339-a73a7a9605e1/volumes"
Feb 21 21:50:18 crc kubenswrapper[4717]: I0221 21:50:18.591952 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kd64g" event={"ID":"5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d","Type":"ContainerStarted","Data":"f6ef9e4569b62ed776113b82b7db3668c0a15e8deee637dd2a7b2526914c6cad"}
Feb 21 21:50:18 crc kubenswrapper[4717]: I0221 21:50:18.592165 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-kd64g"
Feb 21 21:50:18 crc kubenswrapper[4717]: I0221 21:50:18.601952 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kd64g"
Feb 21 21:50:18 crc kubenswrapper[4717]: I0221 21:50:18.606799 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kd64g" podStartSLOduration=2.606785341 podStartE2EDuration="2.606785341s" podCreationTimestamp="2026-02-21 21:50:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:50:18.605376649 +0000 UTC m=+233.386910271" watchObservedRunningTime="2026-02-21 21:50:18.606785341 +0000 UTC m=+233.388318963"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.513335 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dhgrt"]
Feb 21 21:50:19 crc kubenswrapper[4717]: E0221 21:50:19.513816 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0c5c67-a77c-463b-8339-a73a7a9605e1" containerName="extract-utilities"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.513828 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0c5c67-a77c-463b-8339-a73a7a9605e1" containerName="extract-utilities"
Feb 21 21:50:19 crc kubenswrapper[4717]: E0221 21:50:19.513838 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" containerName="registry-server"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.513843 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" containerName="registry-server"
Feb 21 21:50:19 crc kubenswrapper[4717]: E0221 21:50:19.513853 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80125217-20f6-4337-8be2-8874b40aa10e" containerName="registry-server"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.513903 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="80125217-20f6-4337-8be2-8874b40aa10e" containerName="registry-server"
Feb 21 21:50:19 crc kubenswrapper[4717]: E0221 21:50:19.513914 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0c5c67-a77c-463b-8339-a73a7a9605e1" containerName="extract-content"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.513920 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0c5c67-a77c-463b-8339-a73a7a9605e1" containerName="extract-content"
Feb 21 21:50:19 crc kubenswrapper[4717]: E0221 21:50:19.513927 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0c5c67-a77c-463b-8339-a73a7a9605e1" containerName="registry-server"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.513934 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0c5c67-a77c-463b-8339-a73a7a9605e1" containerName="registry-server"
Feb 21 21:50:19 crc kubenswrapper[4717]: E0221 21:50:19.513944 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d25c82-47d6-4706-8235-70fd592a984d" containerName="marketplace-operator"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.513949 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d25c82-47d6-4706-8235-70fd592a984d" containerName="marketplace-operator"
Feb 21 21:50:19 crc kubenswrapper[4717]: E0221 21:50:19.513956 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80125217-20f6-4337-8be2-8874b40aa10e" containerName="extract-utilities"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.513961 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="80125217-20f6-4337-8be2-8874b40aa10e" containerName="extract-utilities"
Feb 21 21:50:19 crc kubenswrapper[4717]: E0221 21:50:19.513968 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="914ea582-116d-4c4a-9d8d-34fda8fb5323" containerName="extract-utilities"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.513974 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="914ea582-116d-4c4a-9d8d-34fda8fb5323" containerName="extract-utilities"
Feb 21 21:50:19 crc kubenswrapper[4717]: E0221 21:50:19.513982 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="914ea582-116d-4c4a-9d8d-34fda8fb5323" containerName="extract-content"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.513988 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="914ea582-116d-4c4a-9d8d-34fda8fb5323" containerName="extract-content"
Feb 21 21:50:19 crc kubenswrapper[4717]: E0221 21:50:19.513995 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="914ea582-116d-4c4a-9d8d-34fda8fb5323" containerName="registry-server"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.514000 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="914ea582-116d-4c4a-9d8d-34fda8fb5323" containerName="registry-server"
Feb 21 21:50:19 crc kubenswrapper[4717]: E0221 21:50:19.514009 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" containerName="extract-content"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.514014 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" containerName="extract-content"
Feb 21 21:50:19 crc kubenswrapper[4717]: E0221 21:50:19.514023 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" containerName="extract-utilities"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.514029 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" containerName="extract-utilities"
Feb 21 21:50:19 crc kubenswrapper[4717]: E0221 21:50:19.514036 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80125217-20f6-4337-8be2-8874b40aa10e" containerName="extract-content"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.514042 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="80125217-20f6-4337-8be2-8874b40aa10e" containerName="extract-content"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.514122 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a12ce2bb-0b6a-46d2-a9f3-bac10126e9cb" containerName="registry-server"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.514135 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="60d25c82-47d6-4706-8235-70fd592a984d" containerName="marketplace-operator"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.514145 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="80125217-20f6-4337-8be2-8874b40aa10e" containerName="registry-server"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.514153 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea0c5c67-a77c-463b-8339-a73a7a9605e1" containerName="registry-server"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.514165 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="914ea582-116d-4c4a-9d8d-34fda8fb5323" containerName="registry-server"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.514840 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhgrt"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.516769 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.524123 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhgrt"]
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.558640 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw8ws\" (UniqueName: \"kubernetes.io/projected/57258517-86da-432f-8123-e3af4325d01a-kube-api-access-sw8ws\") pod \"redhat-marketplace-dhgrt\" (UID: \"57258517-86da-432f-8123-e3af4325d01a\") " pod="openshift-marketplace/redhat-marketplace-dhgrt"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.558785 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57258517-86da-432f-8123-e3af4325d01a-catalog-content\") pod \"redhat-marketplace-dhgrt\" (UID: \"57258517-86da-432f-8123-e3af4325d01a\") " pod="openshift-marketplace/redhat-marketplace-dhgrt"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.559060 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57258517-86da-432f-8123-e3af4325d01a-utilities\") pod \"redhat-marketplace-dhgrt\" (UID: \"57258517-86da-432f-8123-e3af4325d01a\") " pod="openshift-marketplace/redhat-marketplace-dhgrt"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.660023 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57258517-86da-432f-8123-e3af4325d01a-catalog-content\") pod \"redhat-marketplace-dhgrt\" (UID: \"57258517-86da-432f-8123-e3af4325d01a\") " pod="openshift-marketplace/redhat-marketplace-dhgrt"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.660160 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57258517-86da-432f-8123-e3af4325d01a-utilities\") pod \"redhat-marketplace-dhgrt\" (UID: \"57258517-86da-432f-8123-e3af4325d01a\") " pod="openshift-marketplace/redhat-marketplace-dhgrt"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.660200 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw8ws\" (UniqueName: \"kubernetes.io/projected/57258517-86da-432f-8123-e3af4325d01a-kube-api-access-sw8ws\") pod \"redhat-marketplace-dhgrt\" (UID: \"57258517-86da-432f-8123-e3af4325d01a\") " pod="openshift-marketplace/redhat-marketplace-dhgrt"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.661474 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57258517-86da-432f-8123-e3af4325d01a-catalog-content\") pod \"redhat-marketplace-dhgrt\" (UID: \"57258517-86da-432f-8123-e3af4325d01a\") " pod="openshift-marketplace/redhat-marketplace-dhgrt"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.661595 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57258517-86da-432f-8123-e3af4325d01a-utilities\") pod \"redhat-marketplace-dhgrt\" (UID: \"57258517-86da-432f-8123-e3af4325d01a\") " pod="openshift-marketplace/redhat-marketplace-dhgrt"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.682092 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw8ws\" (UniqueName: \"kubernetes.io/projected/57258517-86da-432f-8123-e3af4325d01a-kube-api-access-sw8ws\") pod \"redhat-marketplace-dhgrt\" (UID: \"57258517-86da-432f-8123-e3af4325d01a\") " pod="openshift-marketplace/redhat-marketplace-dhgrt"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.713436 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pbvfd"]
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.714583 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbvfd"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.717048 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.722350 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbvfd"]
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.760897 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f581da-ba2e-469c-b3eb-745f5b14190e-utilities\") pod \"redhat-operators-pbvfd\" (UID: \"38f581da-ba2e-469c-b3eb-745f5b14190e\") " pod="openshift-marketplace/redhat-operators-pbvfd"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.760957 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2rgx\" (UniqueName: \"kubernetes.io/projected/38f581da-ba2e-469c-b3eb-745f5b14190e-kube-api-access-j2rgx\") pod \"redhat-operators-pbvfd\" (UID: \"38f581da-ba2e-469c-b3eb-745f5b14190e\") " pod="openshift-marketplace/redhat-operators-pbvfd"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.760989 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f581da-ba2e-469c-b3eb-745f5b14190e-catalog-content\") pod \"redhat-operators-pbvfd\" (UID: \"38f581da-ba2e-469c-b3eb-745f5b14190e\") " pod="openshift-marketplace/redhat-operators-pbvfd"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.841724 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhgrt"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.865338 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f581da-ba2e-469c-b3eb-745f5b14190e-catalog-content\") pod \"redhat-operators-pbvfd\" (UID: \"38f581da-ba2e-469c-b3eb-745f5b14190e\") " pod="openshift-marketplace/redhat-operators-pbvfd"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.865537 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f581da-ba2e-469c-b3eb-745f5b14190e-utilities\") pod \"redhat-operators-pbvfd\" (UID: \"38f581da-ba2e-469c-b3eb-745f5b14190e\") " pod="openshift-marketplace/redhat-operators-pbvfd"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.865605 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2rgx\" (UniqueName: \"kubernetes.io/projected/38f581da-ba2e-469c-b3eb-745f5b14190e-kube-api-access-j2rgx\") pod \"redhat-operators-pbvfd\" (UID: \"38f581da-ba2e-469c-b3eb-745f5b14190e\") " pod="openshift-marketplace/redhat-operators-pbvfd"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.866206 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f581da-ba2e-469c-b3eb-745f5b14190e-catalog-content\") pod \"redhat-operators-pbvfd\" (UID: \"38f581da-ba2e-469c-b3eb-745f5b14190e\") " pod="openshift-marketplace/redhat-operators-pbvfd"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.866279 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f581da-ba2e-469c-b3eb-745f5b14190e-utilities\") pod \"redhat-operators-pbvfd\" (UID: \"38f581da-ba2e-469c-b3eb-745f5b14190e\") " pod="openshift-marketplace/redhat-operators-pbvfd"
Feb 21 21:50:19 crc kubenswrapper[4717]: I0221 21:50:19.887825 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2rgx\" (UniqueName: \"kubernetes.io/projected/38f581da-ba2e-469c-b3eb-745f5b14190e-kube-api-access-j2rgx\") pod \"redhat-operators-pbvfd\" (UID: \"38f581da-ba2e-469c-b3eb-745f5b14190e\") " pod="openshift-marketplace/redhat-operators-pbvfd"
Feb 21 21:50:20 crc kubenswrapper[4717]: I0221 21:50:20.036943 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbvfd"
Feb 21 21:50:20 crc kubenswrapper[4717]: I0221 21:50:20.256279 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhgrt"]
Feb 21 21:50:20 crc kubenswrapper[4717]: W0221 21:50:20.261029 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57258517_86da_432f_8123_e3af4325d01a.slice/crio-93419485635f18f97c9bb1ff01ff53b7331fd046f79e3144aab3b479e6e809b3 WatchSource:0}: Error finding container 93419485635f18f97c9bb1ff01ff53b7331fd046f79e3144aab3b479e6e809b3: Status 404 returned error can't find the container with id 93419485635f18f97c9bb1ff01ff53b7331fd046f79e3144aab3b479e6e809b3
Feb 21 21:50:20 crc kubenswrapper[4717]: I0221 21:50:20.418208 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbvfd"]
Feb 21 21:50:20 crc kubenswrapper[4717]: I0221 21:50:20.608501 4717 generic.go:334] "Generic (PLEG): container finished" podID="57258517-86da-432f-8123-e3af4325d01a" containerID="5b391a9ef1fb7ad16dbea5c7e8da015d9c09af7f65266ce8864a61b687da7e84" exitCode=0
Feb 21 21:50:20 crc kubenswrapper[4717]: I0221 21:50:20.608588 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhgrt" event={"ID":"57258517-86da-432f-8123-e3af4325d01a","Type":"ContainerDied","Data":"5b391a9ef1fb7ad16dbea5c7e8da015d9c09af7f65266ce8864a61b687da7e84"}
Feb 21 21:50:20 crc kubenswrapper[4717]: I0221 21:50:20.608619 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhgrt" event={"ID":"57258517-86da-432f-8123-e3af4325d01a","Type":"ContainerStarted","Data":"93419485635f18f97c9bb1ff01ff53b7331fd046f79e3144aab3b479e6e809b3"}
Feb 21 21:50:20 crc kubenswrapper[4717]: I0221 21:50:20.610357 4717 generic.go:334] "Generic (PLEG): container finished" podID="38f581da-ba2e-469c-b3eb-745f5b14190e" containerID="a1690aa8e0afdbfeb49477b7936fdab599d5b89fe58c4ec338ce85302740e89a" exitCode=0
Feb 21 21:50:20 crc kubenswrapper[4717]: I0221 21:50:20.611279 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbvfd" event={"ID":"38f581da-ba2e-469c-b3eb-745f5b14190e","Type":"ContainerDied","Data":"a1690aa8e0afdbfeb49477b7936fdab599d5b89fe58c4ec338ce85302740e89a"}
Feb 21 21:50:20 crc kubenswrapper[4717]: I0221 21:50:20.611324 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbvfd" event={"ID":"38f581da-ba2e-469c-b3eb-745f5b14190e","Type":"ContainerStarted","Data":"85900786972d9d51791c1ee1b987a87f6ee039156cbaaa97cee72be6abcbc26d"}
Feb 21 21:50:21 crc kubenswrapper[4717]: I0221 21:50:21.909275 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6d6h2"]
Feb 21 21:50:21 crc kubenswrapper[4717]: I0221 21:50:21.910775 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6d6h2"
Feb 21 21:50:21 crc kubenswrapper[4717]: I0221 21:50:21.913152 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 21 21:50:21 crc kubenswrapper[4717]: I0221 21:50:21.933779 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6d6h2"]
Feb 21 21:50:21 crc kubenswrapper[4717]: I0221 21:50:21.994148 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf74n\" (UniqueName: \"kubernetes.io/projected/faf0d743-c229-4c65-b34e-8220f1bb1cc1-kube-api-access-qf74n\") pod \"certified-operators-6d6h2\" (UID: \"faf0d743-c229-4c65-b34e-8220f1bb1cc1\") " pod="openshift-marketplace/certified-operators-6d6h2"
Feb 21 21:50:21 crc kubenswrapper[4717]: I0221 21:50:21.994264 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faf0d743-c229-4c65-b34e-8220f1bb1cc1-utilities\") pod \"certified-operators-6d6h2\" (UID: \"faf0d743-c229-4c65-b34e-8220f1bb1cc1\") " pod="openshift-marketplace/certified-operators-6d6h2"
Feb 21 21:50:21 crc kubenswrapper[4717]: I0221 21:50:21.994344 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faf0d743-c229-4c65-b34e-8220f1bb1cc1-catalog-content\") pod \"certified-operators-6d6h2\" (UID: \"faf0d743-c229-4c65-b34e-8220f1bb1cc1\") " pod="openshift-marketplace/certified-operators-6d6h2"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.098223 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf74n\" (UniqueName: \"kubernetes.io/projected/faf0d743-c229-4c65-b34e-8220f1bb1cc1-kube-api-access-qf74n\") pod \"certified-operators-6d6h2\" (UID: \"faf0d743-c229-4c65-b34e-8220f1bb1cc1\") " pod="openshift-marketplace/certified-operators-6d6h2"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.099004 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faf0d743-c229-4c65-b34e-8220f1bb1cc1-utilities\") pod \"certified-operators-6d6h2\" (UID: \"faf0d743-c229-4c65-b34e-8220f1bb1cc1\") " pod="openshift-marketplace/certified-operators-6d6h2"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.099044 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faf0d743-c229-4c65-b34e-8220f1bb1cc1-catalog-content\") pod \"certified-operators-6d6h2\" (UID: \"faf0d743-c229-4c65-b34e-8220f1bb1cc1\") " pod="openshift-marketplace/certified-operators-6d6h2"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.099603 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faf0d743-c229-4c65-b34e-8220f1bb1cc1-catalog-content\") pod \"certified-operators-6d6h2\" (UID: \"faf0d743-c229-4c65-b34e-8220f1bb1cc1\") " pod="openshift-marketplace/certified-operators-6d6h2"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.099850 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faf0d743-c229-4c65-b34e-8220f1bb1cc1-utilities\") pod \"certified-operators-6d6h2\" (UID: \"faf0d743-c229-4c65-b34e-8220f1bb1cc1\") " pod="openshift-marketplace/certified-operators-6d6h2"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.112142 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dljqt"]
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.113218 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dljqt"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.115618 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.125133 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf74n\" (UniqueName: \"kubernetes.io/projected/faf0d743-c229-4c65-b34e-8220f1bb1cc1-kube-api-access-qf74n\") pod \"certified-operators-6d6h2\" (UID: \"faf0d743-c229-4c65-b34e-8220f1bb1cc1\") " pod="openshift-marketplace/certified-operators-6d6h2"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.128428 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dljqt"]
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.200244 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/175e04b6-74b9-44e7-94ce-950ffeb16677-catalog-content\") pod \"community-operators-dljqt\" (UID: \"175e04b6-74b9-44e7-94ce-950ffeb16677\") " pod="openshift-marketplace/community-operators-dljqt"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.200286 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmx59\" (UniqueName: \"kubernetes.io/projected/175e04b6-74b9-44e7-94ce-950ffeb16677-kube-api-access-mmx59\") pod \"community-operators-dljqt\" (UID: \"175e04b6-74b9-44e7-94ce-950ffeb16677\") " pod="openshift-marketplace/community-operators-dljqt"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.200331 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/175e04b6-74b9-44e7-94ce-950ffeb16677-utilities\") pod \"community-operators-dljqt\" (UID: \"175e04b6-74b9-44e7-94ce-950ffeb16677\") " pod="openshift-marketplace/community-operators-dljqt"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.227472 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6d6h2"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.301848 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/175e04b6-74b9-44e7-94ce-950ffeb16677-catalog-content\") pod \"community-operators-dljqt\" (UID: \"175e04b6-74b9-44e7-94ce-950ffeb16677\") " pod="openshift-marketplace/community-operators-dljqt"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.302177 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmx59\" (UniqueName: \"kubernetes.io/projected/175e04b6-74b9-44e7-94ce-950ffeb16677-kube-api-access-mmx59\") pod \"community-operators-dljqt\" (UID: \"175e04b6-74b9-44e7-94ce-950ffeb16677\") " pod="openshift-marketplace/community-operators-dljqt"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.302338 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/175e04b6-74b9-44e7-94ce-950ffeb16677-utilities\") pod \"community-operators-dljqt\" (UID: \"175e04b6-74b9-44e7-94ce-950ffeb16677\") " pod="openshift-marketplace/community-operators-dljqt"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.302421 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/175e04b6-74b9-44e7-94ce-950ffeb16677-catalog-content\") pod \"community-operators-dljqt\" (UID: \"175e04b6-74b9-44e7-94ce-950ffeb16677\") " pod="openshift-marketplace/community-operators-dljqt"
Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.302703 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/175e04b6-74b9-44e7-94ce-950ffeb16677-utilities\") pod \"community-operators-dljqt\" (UID: \"175e04b6-74b9-44e7-94ce-950ffeb16677\") " pod="openshift-marketplace/community-operators-dljqt" Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.324962 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmx59\" (UniqueName: \"kubernetes.io/projected/175e04b6-74b9-44e7-94ce-950ffeb16677-kube-api-access-mmx59\") pod \"community-operators-dljqt\" (UID: \"175e04b6-74b9-44e7-94ce-950ffeb16677\") " pod="openshift-marketplace/community-operators-dljqt" Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.448464 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dljqt" Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.467420 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6d6h2"] Feb 21 21:50:22 crc kubenswrapper[4717]: W0221 21:50:22.495423 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaf0d743_c229_4c65_b34e_8220f1bb1cc1.slice/crio-076e8ed164c619630f2287334d588ee6b644cad34b218db0a6ea77929be8ab8e WatchSource:0}: Error finding container 076e8ed164c619630f2287334d588ee6b644cad34b218db0a6ea77929be8ab8e: Status 404 returned error can't find the container with id 076e8ed164c619630f2287334d588ee6b644cad34b218db0a6ea77929be8ab8e Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.626414 4717 generic.go:334] "Generic (PLEG): container finished" podID="38f581da-ba2e-469c-b3eb-745f5b14190e" containerID="f867ba3a12c8c5e17e12ab5e105ec53055fdb6c8f823007903a31bd69f9f9c9f" exitCode=0 Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.626468 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbvfd" 
event={"ID":"38f581da-ba2e-469c-b3eb-745f5b14190e","Type":"ContainerDied","Data":"f867ba3a12c8c5e17e12ab5e105ec53055fdb6c8f823007903a31bd69f9f9c9f"} Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.630059 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6d6h2" event={"ID":"faf0d743-c229-4c65-b34e-8220f1bb1cc1","Type":"ContainerStarted","Data":"bacde4491a4e016225eb8fedbfa0a750f9361a8732c90845843bea3c1126479d"} Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.630091 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6d6h2" event={"ID":"faf0d743-c229-4c65-b34e-8220f1bb1cc1","Type":"ContainerStarted","Data":"076e8ed164c619630f2287334d588ee6b644cad34b218db0a6ea77929be8ab8e"} Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.641196 4717 generic.go:334] "Generic (PLEG): container finished" podID="57258517-86da-432f-8123-e3af4325d01a" containerID="3a4af79cd10acca44b1d05c96dd3aad42d5e845651bb1c78d34563f3f9add815" exitCode=0 Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.641248 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhgrt" event={"ID":"57258517-86da-432f-8123-e3af4325d01a","Type":"ContainerDied","Data":"3a4af79cd10acca44b1d05c96dd3aad42d5e845651bb1c78d34563f3f9add815"} Feb 21 21:50:22 crc kubenswrapper[4717]: I0221 21:50:22.658730 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dljqt"] Feb 21 21:50:22 crc kubenswrapper[4717]: W0221 21:50:22.673717 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod175e04b6_74b9_44e7_94ce_950ffeb16677.slice/crio-64897103e073d2ddd9e1f15c64b318e525adc37016c637da839a98a3c98f191c WatchSource:0}: Error finding container 64897103e073d2ddd9e1f15c64b318e525adc37016c637da839a98a3c98f191c: Status 404 returned error can't find 
the container with id 64897103e073d2ddd9e1f15c64b318e525adc37016c637da839a98a3c98f191c Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.660780 4717 generic.go:334] "Generic (PLEG): container finished" podID="175e04b6-74b9-44e7-94ce-950ffeb16677" containerID="b23a6bb27fd351e00b7931687123fefbd06ae9e5bbb0edf961b390a5d59eb692" exitCode=0 Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.660830 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dljqt" event={"ID":"175e04b6-74b9-44e7-94ce-950ffeb16677","Type":"ContainerDied","Data":"b23a6bb27fd351e00b7931687123fefbd06ae9e5bbb0edf961b390a5d59eb692"} Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.661178 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dljqt" event={"ID":"175e04b6-74b9-44e7-94ce-950ffeb16677","Type":"ContainerStarted","Data":"64897103e073d2ddd9e1f15c64b318e525adc37016c637da839a98a3c98f191c"} Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.666181 4717 generic.go:334] "Generic (PLEG): container finished" podID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" containerID="bacde4491a4e016225eb8fedbfa0a750f9361a8732c90845843bea3c1126479d" exitCode=0 Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.666244 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6d6h2" event={"ID":"faf0d743-c229-4c65-b34e-8220f1bb1cc1","Type":"ContainerDied","Data":"bacde4491a4e016225eb8fedbfa0a750f9361a8732c90845843bea3c1126479d"} Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.990191 4717 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 21 21:50:23 crc kubenswrapper[4717]: E0221 21:50:23.990630 4717 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-pod.yaml: 
couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.990843 4717 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.991074 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.991243 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf" gracePeriod=15 Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.991277 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935" gracePeriod=15 Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.991316 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71" gracePeriod=15 Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.991368 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147" gracePeriod=15 Feb 21 
21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.991470 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0" gracePeriod=15 Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.992031 4717 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 21 21:50:23 crc kubenswrapper[4717]: E0221 21:50:23.992227 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.992244 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 21 21:50:23 crc kubenswrapper[4717]: E0221 21:50:23.992253 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.992260 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 21 21:50:23 crc kubenswrapper[4717]: E0221 21:50:23.992274 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.992280 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 21 21:50:23 crc kubenswrapper[4717]: E0221 21:50:23.992288 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 
21:50:23.992294 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 21 21:50:23 crc kubenswrapper[4717]: E0221 21:50:23.992302 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.992308 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 21 21:50:23 crc kubenswrapper[4717]: E0221 21:50:23.992316 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.992322 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 21 21:50:23 crc kubenswrapper[4717]: E0221 21:50:23.992335 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.992342 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.992429 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.992440 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.992446 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.992454 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.992460 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 21 21:50:23 crc kubenswrapper[4717]: I0221 21:50:23.992472 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.033706 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.151278 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.151719 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.151967 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.152010 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.152040 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.152100 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.153283 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.153353 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.254917 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.254980 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.255004 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.255026 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.255032 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.255059 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.255076 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.255082 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.255100 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.255107 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.255121 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.255129 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.255158 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.255141 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.255193 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:24 crc kubenswrapper[4717]: I0221 21:50:24.255194 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.623519 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.936722 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhgrt" event={"ID":"57258517-86da-432f-8123-e3af4325d01a","Type":"ContainerStarted","Data":"849aa1cfd553969fd60b16008938b2de12e731ca4af43a228ba3076aac15a045"} Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.938818 4717 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.939142 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.939538 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" 
pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.946538 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.949232 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.952987 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147" exitCode=0 Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.953008 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935" exitCode=0 Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.953018 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf" exitCode=0 Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.953027 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71" exitCode=2 Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.953136 4717 scope.go:117] "RemoveContainer" containerID="77feff495aa7ac951033a70d9644e0bc3cf13af9cbd53e4e1354418e980353dc" Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.960291 4717 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-pbvfd" event={"ID":"38f581da-ba2e-469c-b3eb-745f5b14190e","Type":"ContainerStarted","Data":"53dca808ca584e927eb6f8a46f8b0cda53e11e407b30aad9ad46b15bb1618bcf"} Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.963686 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.964324 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.965137 4717 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.965512 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.966040 4717 generic.go:334] "Generic (PLEG): container finished" podID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" 
containerID="8dfe27d2f9908c9888ca788054a73153cd44fbd4e15a55de1d50454f8f9ef383" exitCode=0
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.966072 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"65a6a817-09f9-4c86-a91a-6d4b695cedd1","Type":"ContainerDied","Data":"8dfe27d2f9908c9888ca788054a73153cd44fbd4e15a55de1d50454f8f9ef383"}
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.966586 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.966743 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.967023 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.967580 4717 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:24.968092 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: E0221 21:50:25.645912 4717 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.65:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1896616d565cd8e6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-21 21:50:25.645123814 +0000 UTC m=+240.426657436,LastTimestamp:2026-02-21 21:50:25.645123814 +0000 UTC m=+240.426657436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.985966 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.987201 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.987688 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.988078 4717 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.988249 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.988409 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.988928 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dljqt" event={"ID":"175e04b6-74b9-44e7-94ce-950ffeb16677","Type":"ContainerStarted","Data":"84b8f408b956c22bc6c5b11ab00339ea7994c62ca06c1d83c7278c174f1b9062"}
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.988952 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"95a1136d95aa74f6dc4a3c0276bb554ac7853039c588917ff04c07823b778cce"}
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.988963 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6d6h2" event={"ID":"faf0d743-c229-4c65-b34e-8220f1bb1cc1","Type":"ContainerStarted","Data":"11d14b411dd101ca61c9c54b969a52bbe22ea6ba209f5d101b6d4c4b15afc6f8"}
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.989087 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.990171 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.993636 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.994107 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.995354 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.995549 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 21 21:50:25 crc kubenswrapper[4717]: I0221 21:50:25.995927 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: E0221 21:50:26.276588 4717 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: E0221 21:50:26.277277 4717 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: E0221 21:50:26.277841 4717 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: E0221 21:50:26.278147 4717 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: E0221 21:50:26.278376 4717 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.278401 4717 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 21 21:50:26 crc kubenswrapper[4717]: E0221 21:50:26.278589 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="200ms"
Feb 21 21:50:26 crc kubenswrapper[4717]: E0221 21:50:26.479838 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="400ms"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.574052 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.575214 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.575770 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.576149 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.576480 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.576889 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.577211 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.581582 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.583088 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.583600 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.583762 4717 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.583936 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.584082 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.584239 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.584404 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.584594 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.733346 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65a6a817-09f9-4c86-a91a-6d4b695cedd1-kubelet-dir\") pod \"65a6a817-09f9-4c86-a91a-6d4b695cedd1\" (UID: \"65a6a817-09f9-4c86-a91a-6d4b695cedd1\") "
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.733396 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.733425 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65a6a817-09f9-4c86-a91a-6d4b695cedd1-kube-api-access\") pod \"65a6a817-09f9-4c86-a91a-6d4b695cedd1\" (UID: \"65a6a817-09f9-4c86-a91a-6d4b695cedd1\") "
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.733505 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/65a6a817-09f9-4c86-a91a-6d4b695cedd1-var-lock\") pod \"65a6a817-09f9-4c86-a91a-6d4b695cedd1\" (UID: \"65a6a817-09f9-4c86-a91a-6d4b695cedd1\") "
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.733500 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65a6a817-09f9-4c86-a91a-6d4b695cedd1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "65a6a817-09f9-4c86-a91a-6d4b695cedd1" (UID: "65a6a817-09f9-4c86-a91a-6d4b695cedd1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.733524 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.733567 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.733591 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.733675 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.733625 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65a6a817-09f9-4c86-a91a-6d4b695cedd1-var-lock" (OuterVolumeSpecName: "var-lock") pod "65a6a817-09f9-4c86-a91a-6d4b695cedd1" (UID: "65a6a817-09f9-4c86-a91a-6d4b695cedd1"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.733717 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.734020 4717 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/65a6a817-09f9-4c86-a91a-6d4b695cedd1-var-lock\") on node \"crc\" DevicePath \"\""
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.734032 4717 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.734040 4717 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.734048 4717 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65a6a817-09f9-4c86-a91a-6d4b695cedd1-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.734056 4717 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.739659 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a6a817-09f9-4c86-a91a-6d4b695cedd1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "65a6a817-09f9-4c86-a91a-6d4b695cedd1" (UID: "65a6a817-09f9-4c86-a91a-6d4b695cedd1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:50:26 crc kubenswrapper[4717]: I0221 21:50:26.834766 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65a6a817-09f9-4c86-a91a-6d4b695cedd1-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 21 21:50:26 crc kubenswrapper[4717]: E0221 21:50:26.880651 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="800ms"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.006605 4717 generic.go:334] "Generic (PLEG): container finished" podID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" containerID="11d14b411dd101ca61c9c54b969a52bbe22ea6ba209f5d101b6d4c4b15afc6f8" exitCode=0
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.006661 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6d6h2" event={"ID":"faf0d743-c229-4c65-b34e-8220f1bb1cc1","Type":"ContainerDied","Data":"11d14b411dd101ca61c9c54b969a52bbe22ea6ba209f5d101b6d4c4b15afc6f8"}
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.007404 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.007672 4717 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.008005 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.008338 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.008553 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.008589 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"65a6a817-09f9-4c86-a91a-6d4b695cedd1","Type":"ContainerDied","Data":"d26df56327ff1083375876437479f3afdad5dd5d6c62fdc8df5691cdbf5d4656"}
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.008971 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d26df56327ff1083375876437479f3afdad5dd5d6c62fdc8df5691cdbf5d4656"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.008659 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.009076 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.009751 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.010885 4717 generic.go:334] "Generic (PLEG): container finished" podID="175e04b6-74b9-44e7-94ce-950ffeb16677" containerID="84b8f408b956c22bc6c5b11ab00339ea7994c62ca06c1d83c7278c174f1b9062" exitCode=0
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.010964 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dljqt" event={"ID":"175e04b6-74b9-44e7-94ce-950ffeb16677","Type":"ContainerDied","Data":"84b8f408b956c22bc6c5b11ab00339ea7994c62ca06c1d83c7278c174f1b9062"}
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.011396 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.011693 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.011941 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.012220 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.012585 4717 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.013359 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.013949 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.014100 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"8cff732a3db53445e1fc44f977b3631c8cd4c2f5bf1cdfa1c8b21a54e531b87f"}
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.016022 4717 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.016352 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.016895 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.017072 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.017492 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.018058 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.018263 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.022027 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.024178 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0" exitCode=0
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.024234 4717 scope.go:117] "RemoveContainer" containerID="a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.024271 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.036705 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.038160 4717 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.038443 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.038621 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.038931 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.039123 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.039382 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.039783 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.040086 4717 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.040410 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.041246 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.041519 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.042214 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.042465 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.050179 4717 scope.go:117] "RemoveContainer"
containerID="23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.070509 4717 scope.go:117] "RemoveContainer" containerID="a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.089925 4717 scope.go:117] "RemoveContainer" containerID="c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.123324 4717 scope.go:117] "RemoveContainer" containerID="6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.147647 4717 scope.go:117] "RemoveContainer" containerID="0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.188706 4717 scope.go:117] "RemoveContainer" containerID="a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147" Feb 21 21:50:27 crc kubenswrapper[4717]: E0221 21:50:27.189943 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\": container with ID starting with a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147 not found: ID does not exist" containerID="a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.190013 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147"} err="failed to get container status \"a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\": rpc error: code = NotFound desc = could not find container \"a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147\": container with ID starting with 
a256bf9190d207cc0fd3aa2bd9a1a8547b7b1ab608f9237219b20f797b3f3147 not found: ID does not exist" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.190041 4717 scope.go:117] "RemoveContainer" containerID="23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935" Feb 21 21:50:27 crc kubenswrapper[4717]: E0221 21:50:27.190502 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\": container with ID starting with 23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935 not found: ID does not exist" containerID="23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.190557 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935"} err="failed to get container status \"23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\": rpc error: code = NotFound desc = could not find container \"23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935\": container with ID starting with 23b5d4433a5ca8772798db362d995480f0f5517d2ccc6a34ae6fce9d7c3b2935 not found: ID does not exist" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.190586 4717 scope.go:117] "RemoveContainer" containerID="a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf" Feb 21 21:50:27 crc kubenswrapper[4717]: E0221 21:50:27.190895 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\": container with ID starting with a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf not found: ID does not exist" containerID="a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf" Feb 21 21:50:27 crc 
kubenswrapper[4717]: I0221 21:50:27.190922 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf"} err="failed to get container status \"a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\": rpc error: code = NotFound desc = could not find container \"a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf\": container with ID starting with a2936cf3be45a448983c90f70d04dc3f2965fef6922c6407855e39194599cfcf not found: ID does not exist" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.190939 4717 scope.go:117] "RemoveContainer" containerID="c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71" Feb 21 21:50:27 crc kubenswrapper[4717]: E0221 21:50:27.191246 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\": container with ID starting with c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71 not found: ID does not exist" containerID="c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.191293 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71"} err="failed to get container status \"c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\": rpc error: code = NotFound desc = could not find container \"c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71\": container with ID starting with c825be628d1a959c5d5ba080a09897eb19ed6106b513ebb11af7e7e93efb7b71 not found: ID does not exist" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.191320 4717 scope.go:117] "RemoveContainer" containerID="6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0" Feb 21 
21:50:27 crc kubenswrapper[4717]: E0221 21:50:27.191563 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\": container with ID starting with 6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0 not found: ID does not exist" containerID="6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.191587 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0"} err="failed to get container status \"6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\": rpc error: code = NotFound desc = could not find container \"6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0\": container with ID starting with 6b265c830f0298d2554cf43b12f219eb7ccd5e6a3cfaacbd4d7e7e1d1e89dbb0 not found: ID does not exist" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.191601 4717 scope.go:117] "RemoveContainer" containerID="0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234" Feb 21 21:50:27 crc kubenswrapper[4717]: E0221 21:50:27.191846 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\": container with ID starting with 0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234 not found: ID does not exist" containerID="0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.191912 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234"} err="failed to get container status 
\"0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\": rpc error: code = NotFound desc = could not find container \"0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234\": container with ID starting with 0fb7a6a4a6982ebe8c8eea7945c2a0465d0d6b50b06bf46dc88fd76dc05e3234 not found: ID does not exist" Feb 21 21:50:27 crc kubenswrapper[4717]: E0221 21:50:27.681455 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="1.6s" Feb 21 21:50:27 crc kubenswrapper[4717]: I0221 21:50:27.982199 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 21 21:50:28 crc kubenswrapper[4717]: I0221 21:50:28.033103 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dljqt" event={"ID":"175e04b6-74b9-44e7-94ce-950ffeb16677","Type":"ContainerStarted","Data":"47f8bc49f039904ee310a960a3628b4b0aaad4301c47b00139559adc01e913da"} Feb 21 21:50:28 crc kubenswrapper[4717]: I0221 21:50:28.034006 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:28 crc kubenswrapper[4717]: I0221 21:50:28.034280 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection 
refused" Feb 21 21:50:28 crc kubenswrapper[4717]: I0221 21:50:28.034590 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:28 crc kubenswrapper[4717]: I0221 21:50:28.035051 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:28 crc kubenswrapper[4717]: I0221 21:50:28.035447 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:28 crc kubenswrapper[4717]: I0221 21:50:28.035692 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:28 crc kubenswrapper[4717]: E0221 21:50:28.324546 4717 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.65:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1896616d565cd8e6 openshift-kube-apiserver 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-21 21:50:25.645123814 +0000 UTC m=+240.426657436,LastTimestamp:2026-02-21 21:50:25.645123814 +0000 UTC m=+240.426657436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 21 21:50:29 crc kubenswrapper[4717]: E0221 21:50:29.282394 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="3.2s" Feb 21 21:50:29 crc kubenswrapper[4717]: I0221 21:50:29.842199 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dhgrt" Feb 21 21:50:29 crc kubenswrapper[4717]: I0221 21:50:29.842259 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dhgrt" Feb 21 21:50:29 crc kubenswrapper[4717]: I0221 21:50:29.888189 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dhgrt" Feb 21 21:50:29 crc kubenswrapper[4717]: I0221 21:50:29.888843 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:29 crc kubenswrapper[4717]: I0221 21:50:29.889422 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:29 crc kubenswrapper[4717]: I0221 21:50:29.889713 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:29 crc kubenswrapper[4717]: I0221 21:50:29.890027 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:29 crc kubenswrapper[4717]: I0221 21:50:29.890427 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:29 crc kubenswrapper[4717]: I0221 21:50:29.890947 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.037977 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pbvfd" Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.038021 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pbvfd" Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.043916 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6d6h2" event={"ID":"faf0d743-c229-4c65-b34e-8220f1bb1cc1","Type":"ContainerStarted","Data":"a64f7bc2e4451eaeed10ce0571c4276e37b89ea64f7f4f028351a49e378a54e4"} Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.044710 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.045082 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.047080 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 
38.102.83.65:6443: connect: connection refused" Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.047313 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.047588 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.048017 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.088646 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dhgrt" Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.089290 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.089631 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" 
pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.089878 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.090176 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.090425 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:30 crc kubenswrapper[4717]: I0221 21:50:30.090707 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:31 crc kubenswrapper[4717]: I0221 21:50:31.073605 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pbvfd" 
podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" containerName="registry-server" probeResult="failure" output=< Feb 21 21:50:31 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Feb 21 21:50:31 crc kubenswrapper[4717]: > Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.229978 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6d6h2" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.230641 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6d6h2" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.295103 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6d6h2" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.295758 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.296157 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.296620 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: 
connection refused" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.297415 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.297778 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.298136 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.449084 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dljqt" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.449703 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dljqt" Feb 21 21:50:32 crc kubenswrapper[4717]: E0221 21:50:32.484379 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="6.4s" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.502511 4717 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dljqt" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.503279 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.503698 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.504213 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.504712 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.505206 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:32 crc kubenswrapper[4717]: I0221 21:50:32.505762 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:33 crc kubenswrapper[4717]: I0221 21:50:33.100892 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dljqt" Feb 21 21:50:33 crc kubenswrapper[4717]: I0221 21:50:33.101302 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:33 crc kubenswrapper[4717]: I0221 21:50:33.101606 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:33 crc kubenswrapper[4717]: I0221 21:50:33.101834 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:33 crc kubenswrapper[4717]: I0221 21:50:33.102034 4717 
status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:33 crc kubenswrapper[4717]: I0221 21:50:33.102179 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:33 crc kubenswrapper[4717]: I0221 21:50:33.102326 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:34 crc kubenswrapper[4717]: I0221 21:50:34.120184 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6d6h2" Feb 21 21:50:34 crc kubenswrapper[4717]: I0221 21:50:34.122099 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:34 crc kubenswrapper[4717]: I0221 21:50:34.123299 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:34 crc kubenswrapper[4717]: I0221 21:50:34.123787 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:34 crc kubenswrapper[4717]: I0221 21:50:34.124174 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:34 crc kubenswrapper[4717]: I0221 21:50:34.124626 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:34 crc kubenswrapper[4717]: I0221 21:50:34.125162 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:35 crc kubenswrapper[4717]: I0221 21:50:35.982318 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:35 crc kubenswrapper[4717]: I0221 21:50:35.983091 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:35 crc kubenswrapper[4717]: I0221 21:50:35.983540 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:35 crc kubenswrapper[4717]: I0221 21:50:35.983950 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:35 crc kubenswrapper[4717]: I0221 21:50:35.984322 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:35 crc kubenswrapper[4717]: I0221 21:50:35.984650 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:36 crc kubenswrapper[4717]: E0221 21:50:36.081237 4717 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.65:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" volumeName="registry-storage" Feb 21 21:50:37 crc kubenswrapper[4717]: I0221 21:50:37.976251 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:37 crc kubenswrapper[4717]: I0221 21:50:37.977919 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:37 crc kubenswrapper[4717]: I0221 21:50:37.978740 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:37 crc kubenswrapper[4717]: I0221 21:50:37.978972 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:37 crc kubenswrapper[4717]: I0221 21:50:37.979143 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:37 crc kubenswrapper[4717]: I0221 21:50:37.979304 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:37 crc kubenswrapper[4717]: I0221 21:50:37.979483 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:38 crc kubenswrapper[4717]: I0221 21:50:38.000966 4717 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8fdcf84-b93a-45e0-aaf0-170c282e61d5" Feb 21 21:50:38 crc kubenswrapper[4717]: I0221 21:50:38.000994 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8fdcf84-b93a-45e0-aaf0-170c282e61d5" Feb 21 21:50:38 crc kubenswrapper[4717]: E0221 21:50:38.001580 4717 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:38 crc kubenswrapper[4717]: I0221 21:50:38.002464 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:38 crc kubenswrapper[4717]: W0221 21:50:38.020748 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-b5ff55efd3f79baf4949ab449e5d4399b2f68cd0bf57d6f16dbb2b9ad85dd6c7 WatchSource:0}: Error finding container b5ff55efd3f79baf4949ab449e5d4399b2f68cd0bf57d6f16dbb2b9ad85dd6c7: Status 404 returned error can't find the container with id b5ff55efd3f79baf4949ab449e5d4399b2f68cd0bf57d6f16dbb2b9ad85dd6c7 Feb 21 21:50:38 crc kubenswrapper[4717]: I0221 21:50:38.101179 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b5ff55efd3f79baf4949ab449e5d4399b2f68cd0bf57d6f16dbb2b9ad85dd6c7"} Feb 21 21:50:38 crc kubenswrapper[4717]: E0221 21:50:38.326026 4717 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.65:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1896616d565cd8e6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-21 21:50:25.645123814 +0000 UTC m=+240.426657436,LastTimestamp:2026-02-21 21:50:25.645123814 +0000 UTC m=+240.426657436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 21 21:50:38 crc kubenswrapper[4717]: E0221 21:50:38.885462 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="7s" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.108303 4717 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3a9e8eb69136d0a973bfe5c7d16a5fe1ec36056bc3e263f4fdacca165580f5a2" exitCode=0 Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.108370 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3a9e8eb69136d0a973bfe5c7d16a5fe1ec36056bc3e263f4fdacca165580f5a2"} Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.108643 4717 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8fdcf84-b93a-45e0-aaf0-170c282e61d5" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.108678 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8fdcf84-b93a-45e0-aaf0-170c282e61d5" Feb 21 21:50:39 crc kubenswrapper[4717]: E0221 21:50:39.109208 4717 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.110024 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.110652 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.111391 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.112527 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.113305 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.113852 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.115288 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.115329 4717 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd" exitCode=1 Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.115350 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd"} Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.115799 4717 scope.go:117] "RemoveContainer" containerID="8366347c92468c3b97fe868a57271d30441d5b65109627d6005f28e11b3957fd" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.117429 4717 status_manager.go:851] "Failed to get status for pod" podUID="38f581da-ba2e-469c-b3eb-745f5b14190e" pod="openshift-marketplace/redhat-operators-pbvfd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pbvfd\": dial tcp 38.102.83.65:6443: connect: 
connection refused" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.118112 4717 status_manager.go:851] "Failed to get status for pod" podUID="faf0d743-c229-4c65-b34e-8220f1bb1cc1" pod="openshift-marketplace/certified-operators-6d6h2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-6d6h2\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.118723 4717 status_manager.go:851] "Failed to get status for pod" podUID="175e04b6-74b9-44e7-94ce-950ffeb16677" pod="openshift-marketplace/community-operators-dljqt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dljqt\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.119150 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.119540 4717 status_manager.go:851] "Failed to get status for pod" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.119954 4717 status_manager.go:851] "Failed to get status for pod" podUID="57258517-86da-432f-8123-e3af4325d01a" pod="openshift-marketplace/redhat-marketplace-dhgrt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dhgrt\": dial tcp 38.102.83.65:6443: connect: connection 
refused" Feb 21 21:50:39 crc kubenswrapper[4717]: I0221 21:50:39.120333 4717 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 21 21:50:40 crc kubenswrapper[4717]: I0221 21:50:40.104743 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pbvfd" Feb 21 21:50:40 crc kubenswrapper[4717]: I0221 21:50:40.123662 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1a3ffc142884ce64f5266593ccc29e45a795b52a382ae6dff789e0326c288aac"} Feb 21 21:50:40 crc kubenswrapper[4717]: I0221 21:50:40.123718 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3c3eb32e3014729b0a030cae66bf8aceaaeb58481edc7b3d6e72c5a3ed364994"} Feb 21 21:50:40 crc kubenswrapper[4717]: I0221 21:50:40.123727 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"005e2d1b53bdce51245b5b143abec4632745de11b3aeab3a7079aa34c37c60c1"} Feb 21 21:50:40 crc kubenswrapper[4717]: I0221 21:50:40.131933 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 21 21:50:40 crc kubenswrapper[4717]: I0221 21:50:40.132015 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e32f59f9dbcbd66b6a68526750e16ee1f9b92dcf589d39d64776c5f28a5651af"} Feb 21 21:50:40 crc kubenswrapper[4717]: I0221 21:50:40.165528 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pbvfd" Feb 21 21:50:41 crc kubenswrapper[4717]: I0221 21:50:41.139022 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2250911efcf3de8b614c330bb987511490a42fbc6a2087c23a3fc5047635e241"} Feb 21 21:50:41 crc kubenswrapper[4717]: I0221 21:50:41.139055 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f518412146e3de8d4fe7948dde2937c2301f800753b5156324a839a66f2d4e80"} Feb 21 21:50:41 crc kubenswrapper[4717]: I0221 21:50:41.139287 4717 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8fdcf84-b93a-45e0-aaf0-170c282e61d5" Feb 21 21:50:41 crc kubenswrapper[4717]: I0221 21:50:41.139300 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8fdcf84-b93a-45e0-aaf0-170c282e61d5" Feb 21 21:50:41 crc kubenswrapper[4717]: I0221 21:50:41.139471 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:43 crc kubenswrapper[4717]: I0221 21:50:43.002992 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:43 crc kubenswrapper[4717]: I0221 21:50:43.003330 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:43 crc kubenswrapper[4717]: I0221 21:50:43.012605 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:45 crc kubenswrapper[4717]: I0221 21:50:45.169458 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 21:50:46 crc kubenswrapper[4717]: I0221 21:50:46.174646 4717 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:50:46 crc kubenswrapper[4717]: I0221 21:50:46.247813 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 21:50:46 crc kubenswrapper[4717]: I0221 21:50:46.252624 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 21:50:46 crc kubenswrapper[4717]: I0221 21:50:46.329274 4717 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e8634914-c406-4a1f-b9bb-e1895404be7f" Feb 21 21:50:47 crc kubenswrapper[4717]: I0221 21:50:47.183141 4717 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8fdcf84-b93a-45e0-aaf0-170c282e61d5" Feb 21 21:50:47 crc kubenswrapper[4717]: I0221 21:50:47.183190 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8fdcf84-b93a-45e0-aaf0-170c282e61d5" Feb 21 21:50:47 crc kubenswrapper[4717]: I0221 21:50:47.186111 4717 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" 
podUID="e8634914-c406-4a1f-b9bb-e1895404be7f" Feb 21 21:50:55 crc kubenswrapper[4717]: I0221 21:50:55.175768 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 21 21:50:56 crc kubenswrapper[4717]: I0221 21:50:56.301683 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 21 21:50:56 crc kubenswrapper[4717]: I0221 21:50:56.308699 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 21 21:50:56 crc kubenswrapper[4717]: I0221 21:50:56.331505 4717 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 21 21:50:56 crc kubenswrapper[4717]: I0221 21:50:56.567237 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 21 21:50:56 crc kubenswrapper[4717]: I0221 21:50:56.626237 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 21 21:50:56 crc kubenswrapper[4717]: I0221 21:50:56.647357 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 21 21:50:56 crc kubenswrapper[4717]: I0221 21:50:56.919744 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 21 21:50:57 crc kubenswrapper[4717]: I0221 21:50:57.315407 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 21 21:50:57 crc kubenswrapper[4717]: I0221 21:50:57.442373 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" 
Feb 21 21:50:57 crc kubenswrapper[4717]: I0221 21:50:57.667739 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 21 21:50:57 crc kubenswrapper[4717]: I0221 21:50:57.908747 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 21 21:50:57 crc kubenswrapper[4717]: I0221 21:50:57.940639 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 21 21:50:58 crc kubenswrapper[4717]: I0221 21:50:58.308923 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 21 21:50:58 crc kubenswrapper[4717]: I0221 21:50:58.382071 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 21 21:50:58 crc kubenswrapper[4717]: I0221 21:50:58.442438 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 21 21:50:58 crc kubenswrapper[4717]: I0221 21:50:58.537576 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 21 21:50:58 crc kubenswrapper[4717]: I0221 21:50:58.621391 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 21 21:50:58 crc kubenswrapper[4717]: I0221 21:50:58.773325 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 21 21:50:58 crc kubenswrapper[4717]: I0221 21:50:58.837010 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 21 21:50:58 crc kubenswrapper[4717]: I0221 21:50:58.838318 4717 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 21 21:50:58 crc kubenswrapper[4717]: I0221 21:50:58.876251 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 21 21:50:58 crc kubenswrapper[4717]: I0221 21:50:58.908318 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 21 21:50:58 crc kubenswrapper[4717]: I0221 21:50:58.923380 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 21 21:50:58 crc kubenswrapper[4717]: I0221 21:50:58.934132 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 21 21:50:58 crc kubenswrapper[4717]: I0221 21:50:58.978295 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.045439 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.053969 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.199358 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.238916 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.246224 4717 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.421707 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.494190 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.506991 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.527585 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.681397 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.719699 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.761332 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.768416 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.815687 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.831120 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 21 
21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.844460 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.920930 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 21 21:50:59 crc kubenswrapper[4717]: I0221 21:50:59.951725 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 21 21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.095419 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 21 21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.100796 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 21 21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.282996 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 21 21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.317966 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 21 21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.367193 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 21 21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.403508 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 21 21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.466370 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 21 21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.483260 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 21 
21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.588499 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 21 21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.605220 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 21 21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.661031 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 21 21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.777757 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 21 21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.777803 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 21 21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.867525 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 21 21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.915413 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 21 21:51:00 crc kubenswrapper[4717]: I0221 21:51:00.958028 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.009841 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.017786 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.098619 4717 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"signing-cabundle" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.119703 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.196993 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.379545 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.379588 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.410056 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.474552 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.490215 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.503571 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.634734 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.776521 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.781136 4717 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.862171 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.920629 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.925000 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.966555 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.978794 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 21 21:51:01 crc kubenswrapper[4717]: I0221 21:51:01.989437 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.086336 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.087915 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.167027 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.195057 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 
21:51:02.200472 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.329849 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.472425 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.531234 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.533184 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.657210 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.718195 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.793304 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.880561 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.939688 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.946395 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.956494 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 21 21:51:02 crc kubenswrapper[4717]: I0221 21:51:02.995226 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 21 21:51:03 crc kubenswrapper[4717]: I0221 21:51:03.022304 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 21 21:51:03 crc kubenswrapper[4717]: I0221 21:51:03.058512 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 21 21:51:03 crc kubenswrapper[4717]: I0221 21:51:03.087968 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 21 21:51:03 crc kubenswrapper[4717]: I0221 21:51:03.097807 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 21 21:51:03 crc kubenswrapper[4717]: I0221 21:51:03.155031 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 21 21:51:03 crc kubenswrapper[4717]: I0221 21:51:03.222312 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 21 21:51:03 crc kubenswrapper[4717]: I0221 21:51:03.283006 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 21 21:51:03 crc kubenswrapper[4717]: I0221 21:51:03.414734 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 21 21:51:03 crc kubenswrapper[4717]: I0221 21:51:03.484912 4717 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 21 21:51:03 crc kubenswrapper[4717]: I0221 21:51:03.542910 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 21 21:51:03 crc kubenswrapper[4717]: I0221 21:51:03.573041 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 21 21:51:03 crc kubenswrapper[4717]: I0221 21:51:03.651401 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 21 21:51:03 crc kubenswrapper[4717]: I0221 21:51:03.660104 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 21 21:51:03 crc kubenswrapper[4717]: I0221 21:51:03.729074 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 21 21:51:03 crc kubenswrapper[4717]: I0221 21:51:03.800534 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 21 21:51:04 crc kubenswrapper[4717]: I0221 21:51:04.023565 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 21 21:51:04 crc kubenswrapper[4717]: I0221 21:51:04.148998 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 21 21:51:04 crc kubenswrapper[4717]: I0221 21:51:04.267927 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 21 21:51:04 crc kubenswrapper[4717]: I0221 21:51:04.304101 4717 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"kube-root-ca.crt" Feb 21 21:51:04 crc kubenswrapper[4717]: I0221 21:51:04.352238 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 21 21:51:04 crc kubenswrapper[4717]: I0221 21:51:04.356607 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 21 21:51:04 crc kubenswrapper[4717]: I0221 21:51:04.357493 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 21 21:51:04 crc kubenswrapper[4717]: I0221 21:51:04.368286 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 21 21:51:04 crc kubenswrapper[4717]: I0221 21:51:04.572626 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 21 21:51:04 crc kubenswrapper[4717]: I0221 21:51:04.646264 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 21 21:51:04 crc kubenswrapper[4717]: I0221 21:51:04.718338 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 21 21:51:04 crc kubenswrapper[4717]: I0221 21:51:04.788370 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 21 21:51:04 crc kubenswrapper[4717]: I0221 21:51:04.802192 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 21 21:51:04 crc kubenswrapper[4717]: I0221 21:51:04.818389 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 21 
21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.007379 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.116374 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.116585 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.135938 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.164404 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.345521 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.359844 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.430348 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.452370 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.497092 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.596190 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 21 21:51:05 crc 
kubenswrapper[4717]: I0221 21:51:05.657311 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.664635 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.699753 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.700017 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.701694 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.736164 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.804384 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.807540 4717 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.807932 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dljqt" podStartSLOduration=40.047767234 podStartE2EDuration="43.807905365s" podCreationTimestamp="2026-02-21 21:50:22 +0000 UTC" firstStartedPulling="2026-02-21 21:50:23.662835352 +0000 UTC m=+238.444368974" lastFinishedPulling="2026-02-21 21:50:27.422973483 +0000 UTC m=+242.204507105" observedRunningTime="2026-02-21 
21:50:46.289477211 +0000 UTC m=+261.071010833" watchObservedRunningTime="2026-02-21 21:51:05.807905365 +0000 UTC m=+280.589439027" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.808656 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pbvfd" podStartSLOduration=44.206478053 podStartE2EDuration="46.80864492s" podCreationTimestamp="2026-02-21 21:50:19 +0000 UTC" firstStartedPulling="2026-02-21 21:50:20.611420357 +0000 UTC m=+235.392954019" lastFinishedPulling="2026-02-21 21:50:23.213587264 +0000 UTC m=+237.995120886" observedRunningTime="2026-02-21 21:50:46.258740526 +0000 UTC m=+261.040274158" watchObservedRunningTime="2026-02-21 21:51:05.80864492 +0000 UTC m=+280.590178582" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.812538 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.812842 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6d6h2" podStartSLOduration=40.810829866 podStartE2EDuration="44.812826286s" podCreationTimestamp="2026-02-21 21:50:21 +0000 UTC" firstStartedPulling="2026-02-21 21:50:23.667347484 +0000 UTC m=+238.448881106" lastFinishedPulling="2026-02-21 21:50:27.669343884 +0000 UTC m=+242.450877526" observedRunningTime="2026-02-21 21:50:46.273529931 +0000 UTC m=+261.055063553" watchObservedRunningTime="2026-02-21 21:51:05.812826286 +0000 UTC m=+280.594359918" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.813655 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=41.813646973 podStartE2EDuration="41.813646973s" podCreationTimestamp="2026-02-21 21:50:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-21 21:50:46.136050383 +0000 UTC m=+260.917584005" watchObservedRunningTime="2026-02-21 21:51:05.813646973 +0000 UTC m=+280.595180605" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.815071 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dhgrt" podStartSLOduration=44.150904254 podStartE2EDuration="46.815059272s" podCreationTimestamp="2026-02-21 21:50:19 +0000 UTC" firstStartedPulling="2026-02-21 21:50:20.609785311 +0000 UTC m=+235.391318933" lastFinishedPulling="2026-02-21 21:50:23.273940329 +0000 UTC m=+238.055473951" observedRunningTime="2026-02-21 21:50:46.214150519 +0000 UTC m=+260.995684141" watchObservedRunningTime="2026-02-21 21:51:05.815059272 +0000 UTC m=+280.596592904" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.815684 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.815730 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.827132 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.854099 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.854079236 podStartE2EDuration="19.854079236s" podCreationTimestamp="2026-02-21 21:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:51:05.836042944 +0000 UTC m=+280.617576566" watchObservedRunningTime="2026-02-21 21:51:05.854079236 +0000 UTC m=+280.635612878" Feb 21 21:51:05 crc kubenswrapper[4717]: I0221 21:51:05.911380 4717 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.021972 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.024424 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.035712 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.092265 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.233491 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.233656 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.258728 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.319612 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.403380 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.456883 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 21 21:51:06 crc 
kubenswrapper[4717]: I0221 21:51:06.485749 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.509649 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.541909 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.618852 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.733665 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.765096 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.769453 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.824225 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.838665 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.897453 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.917125 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 21 21:51:06 crc kubenswrapper[4717]: I0221 21:51:06.963966 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.019166 4717 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.069810 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.144032 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.162758 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.194835 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.251883 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.322757 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.324427 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.386135 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 
21:51:07.423005 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.423544 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.451818 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.478271 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.614668 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.704042 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.834973 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.875812 4717 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 21 21:51:07 crc kubenswrapper[4717]: I0221 21:51:07.978667 4717 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.041914 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.059562 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.072195 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.075491 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.208706 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.223200 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.274972 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.560601 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.585273 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.610123 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.640252 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.640443 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.691605 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.695355 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.715340 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.748113 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.748828 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.817241 4717 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.817538 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://8cff732a3db53445e1fc44f977b3631c8cd4c2f5bf1cdfa1c8b21a54e531b87f" gracePeriod=5
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.923137 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 21 21:51:08 crc kubenswrapper[4717]: I0221 21:51:08.986607 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 21 21:51:09 crc kubenswrapper[4717]: I0221 21:51:09.013657 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 21 21:51:09 crc kubenswrapper[4717]: I0221 21:51:09.081903 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 21 21:51:09 crc kubenswrapper[4717]: I0221 21:51:09.162391 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 21 21:51:09 crc kubenswrapper[4717]: I0221 21:51:09.189591 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 21 21:51:09 crc kubenswrapper[4717]: I0221 21:51:09.225246 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 21 21:51:09 crc kubenswrapper[4717]: I0221 21:51:09.437121 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 21 21:51:09 crc kubenswrapper[4717]: I0221 21:51:09.510434 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 21 21:51:09 crc kubenswrapper[4717]: I0221 21:51:09.645637 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 21 21:51:09 crc kubenswrapper[4717]: I0221 21:51:09.709634 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 21 21:51:09 crc kubenswrapper[4717]: I0221 21:51:09.809409 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 21 21:51:09 crc kubenswrapper[4717]: I0221 21:51:09.851674 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 21 21:51:09 crc kubenswrapper[4717]: I0221 21:51:09.857913 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.131999 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.146922 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.149098 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.157177 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.228286 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.271667 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.294751 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.353419 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.432006 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.481232 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.500162 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.537181 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.554517 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.572032 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.617435 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.647446 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.702457 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 21 21:51:10 crc kubenswrapper[4717]: I0221 21:51:10.772589 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 21 21:51:11 crc kubenswrapper[4717]: I0221 21:51:11.126399 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 21 21:51:11 crc kubenswrapper[4717]: I0221 21:51:11.212481 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 21 21:51:11 crc kubenswrapper[4717]: I0221 21:51:11.348238 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 21 21:51:11 crc kubenswrapper[4717]: I0221 21:51:11.456020 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 21 21:51:11 crc kubenswrapper[4717]: I0221 21:51:11.543612 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 21 21:51:11 crc kubenswrapper[4717]: I0221 21:51:11.627199 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 21 21:51:11 crc kubenswrapper[4717]: I0221 21:51:11.639172 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 21 21:51:11 crc kubenswrapper[4717]: I0221 21:51:11.651955 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 21 21:51:11 crc kubenswrapper[4717]: I0221 21:51:11.747327 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 21 21:51:11 crc kubenswrapper[4717]: I0221 21:51:11.782631 4717 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 21 21:51:11 crc kubenswrapper[4717]: I0221 21:51:11.843175 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 21 21:51:11 crc kubenswrapper[4717]: I0221 21:51:11.889739 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 21 21:51:12 crc kubenswrapper[4717]: I0221 21:51:12.095154 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 21 21:51:12 crc kubenswrapper[4717]: I0221 21:51:12.497831 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 21 21:51:13 crc kubenswrapper[4717]: I0221 21:51:13.361045 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.370590 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.370964 4717 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="8cff732a3db53445e1fc44f977b3631c8cd4c2f5bf1cdfa1c8b21a54e531b87f" exitCode=137
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.371019 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95a1136d95aa74f6dc4a3c0276bb554ac7853039c588917ff04c07823b778cce"
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.418432 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.418515 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.546345 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.546488 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.546526 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.546510 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.546578 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.546612 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.546652 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.546691 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.546785 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.547292 4717 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.547336 4717 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.547363 4717 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.547389 4717 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.558389 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 21:51:14 crc kubenswrapper[4717]: I0221 21:51:14.648499 4717 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:15 crc kubenswrapper[4717]: I0221 21:51:15.378270 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 21 21:51:16 crc kubenswrapper[4717]: I0221 21:51:16.004202 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Feb 21 21:51:16 crc kubenswrapper[4717]: I0221 21:51:16.005152 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Feb 21 21:51:16 crc kubenswrapper[4717]: I0221 21:51:16.024180 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 21 21:51:16 crc kubenswrapper[4717]: I0221 21:51:16.024254 4717 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c3d752d0-7de0-4c90-a9b2-c081dd850e04"
Feb 21 21:51:16 crc kubenswrapper[4717]: I0221 21:51:16.033577 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 21 21:51:16 crc kubenswrapper[4717]: I0221 21:51:16.033691 4717 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c3d752d0-7de0-4c90-a9b2-c081dd850e04"
Feb 21 21:51:21 crc kubenswrapper[4717]: I0221 21:51:21.115017 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 21 21:51:25 crc kubenswrapper[4717]: I0221 21:51:25.737380 4717 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Feb 21 21:51:28 crc kubenswrapper[4717]: I0221 21:51:28.362321 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 21 21:51:31 crc kubenswrapper[4717]: I0221 21:51:31.496541 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.414960 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ltlhm"]
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.415815 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" podUID="86542c4e-7f2b-4933-9d0d-737228524851" containerName="controller-manager" containerID="cri-o://39d1cbfc87adf21f27f9798085ed1c90567909a1a41bc56cc73e84d2901c2272" gracePeriod=30
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.422489 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9"]
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.422735 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" podUID="d105add5-0618-41cf-ae5d-59b739694e4c" containerName="route-controller-manager" containerID="cri-o://8347825bb77d0cf07347a3a1246851aa0cd174bca5318a07ff31ee53765d7c74" gracePeriod=30
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.692290 4717 generic.go:334] "Generic (PLEG): container finished" podID="d105add5-0618-41cf-ae5d-59b739694e4c" containerID="8347825bb77d0cf07347a3a1246851aa0cd174bca5318a07ff31ee53765d7c74" exitCode=0
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.692386 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" event={"ID":"d105add5-0618-41cf-ae5d-59b739694e4c","Type":"ContainerDied","Data":"8347825bb77d0cf07347a3a1246851aa0cd174bca5318a07ff31ee53765d7c74"}
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.696652 4717 generic.go:334] "Generic (PLEG): container finished" podID="86542c4e-7f2b-4933-9d0d-737228524851" containerID="39d1cbfc87adf21f27f9798085ed1c90567909a1a41bc56cc73e84d2901c2272" exitCode=0
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.696697 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" event={"ID":"86542c4e-7f2b-4933-9d0d-737228524851","Type":"ContainerDied","Data":"39d1cbfc87adf21f27f9798085ed1c90567909a1a41bc56cc73e84d2901c2272"}
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.802293 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm"
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.814111 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9"
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.944382 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d105add5-0618-41cf-ae5d-59b739694e4c-serving-cert\") pod \"d105add5-0618-41cf-ae5d-59b739694e4c\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") "
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.944777 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-config\") pod \"86542c4e-7f2b-4933-9d0d-737228524851\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") "
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.944838 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f65fg\" (UniqueName: \"kubernetes.io/projected/86542c4e-7f2b-4933-9d0d-737228524851-kube-api-access-f65fg\") pod \"86542c4e-7f2b-4933-9d0d-737228524851\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") "
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.944892 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86542c4e-7f2b-4933-9d0d-737228524851-serving-cert\") pod \"86542c4e-7f2b-4933-9d0d-737228524851\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") "
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.944928 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d105add5-0618-41cf-ae5d-59b739694e4c-config\") pod \"d105add5-0618-41cf-ae5d-59b739694e4c\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") "
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.944951 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d105add5-0618-41cf-ae5d-59b739694e4c-client-ca\") pod \"d105add5-0618-41cf-ae5d-59b739694e4c\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") "
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.944979 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-proxy-ca-bundles\") pod \"86542c4e-7f2b-4933-9d0d-737228524851\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") "
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.945034 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chz4k\" (UniqueName: \"kubernetes.io/projected/d105add5-0618-41cf-ae5d-59b739694e4c-kube-api-access-chz4k\") pod \"d105add5-0618-41cf-ae5d-59b739694e4c\" (UID: \"d105add5-0618-41cf-ae5d-59b739694e4c\") "
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.945067 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-client-ca\") pod \"86542c4e-7f2b-4933-9d0d-737228524851\" (UID: \"86542c4e-7f2b-4933-9d0d-737228524851\") "
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.945899 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d105add5-0618-41cf-ae5d-59b739694e4c-client-ca" (OuterVolumeSpecName: "client-ca") pod "d105add5-0618-41cf-ae5d-59b739694e4c" (UID: "d105add5-0618-41cf-ae5d-59b739694e4c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.945919 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d105add5-0618-41cf-ae5d-59b739694e4c-config" (OuterVolumeSpecName: "config") pod "d105add5-0618-41cf-ae5d-59b739694e4c" (UID: "d105add5-0618-41cf-ae5d-59b739694e4c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.945950 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-client-ca" (OuterVolumeSpecName: "client-ca") pod "86542c4e-7f2b-4933-9d0d-737228524851" (UID: "86542c4e-7f2b-4933-9d0d-737228524851"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.945936 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "86542c4e-7f2b-4933-9d0d-737228524851" (UID: "86542c4e-7f2b-4933-9d0d-737228524851"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.946562 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-config" (OuterVolumeSpecName: "config") pod "86542c4e-7f2b-4933-9d0d-737228524851" (UID: "86542c4e-7f2b-4933-9d0d-737228524851"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.951433 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86542c4e-7f2b-4933-9d0d-737228524851-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "86542c4e-7f2b-4933-9d0d-737228524851" (UID: "86542c4e-7f2b-4933-9d0d-737228524851"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.951974 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d105add5-0618-41cf-ae5d-59b739694e4c-kube-api-access-chz4k" (OuterVolumeSpecName: "kube-api-access-chz4k") pod "d105add5-0618-41cf-ae5d-59b739694e4c" (UID: "d105add5-0618-41cf-ae5d-59b739694e4c"). InnerVolumeSpecName "kube-api-access-chz4k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.952494 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86542c4e-7f2b-4933-9d0d-737228524851-kube-api-access-f65fg" (OuterVolumeSpecName: "kube-api-access-f65fg") pod "86542c4e-7f2b-4933-9d0d-737228524851" (UID: "86542c4e-7f2b-4933-9d0d-737228524851"). InnerVolumeSpecName "kube-api-access-f65fg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:51:38 crc kubenswrapper[4717]: I0221 21:51:38.953378 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d105add5-0618-41cf-ae5d-59b739694e4c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d105add5-0618-41cf-ae5d-59b739694e4c" (UID: "d105add5-0618-41cf-ae5d-59b739694e4c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.046161 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-config\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.046211 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f65fg\" (UniqueName: \"kubernetes.io/projected/86542c4e-7f2b-4933-9d0d-737228524851-kube-api-access-f65fg\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.046232 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86542c4e-7f2b-4933-9d0d-737228524851-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.046251 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d105add5-0618-41cf-ae5d-59b739694e4c-config\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.046268 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d105add5-0618-41cf-ae5d-59b739694e4c-client-ca\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.046284 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.046303 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chz4k\" (UniqueName: \"kubernetes.io/projected/d105add5-0618-41cf-ae5d-59b739694e4c-kube-api-access-chz4k\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.046320 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86542c4e-7f2b-4933-9d0d-737228524851-client-ca\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.046336 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d105add5-0618-41cf-ae5d-59b739694e4c-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.706704 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm" event={"ID":"86542c4e-7f2b-4933-9d0d-737228524851","Type":"ContainerDied","Data":"743bb0d19d88c2972cba9ba7b9cb1ce2f7b050aab0e7f0cedfad6820a225c28b"}
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.707121 4717 scope.go:117] "RemoveContainer" containerID="39d1cbfc87adf21f27f9798085ed1c90567909a1a41bc56cc73e84d2901c2272"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.706750 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ltlhm"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.709534 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9" event={"ID":"d105add5-0618-41cf-ae5d-59b739694e4c","Type":"ContainerDied","Data":"8b4cbe7d323243073b18369ba4c18078cc7cdc895798c88776a5032994e4432b"}
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.709646 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.743840 4717 scope.go:117] "RemoveContainer" containerID="8347825bb77d0cf07347a3a1246851aa0cd174bca5318a07ff31ee53765d7c74"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.758660 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ltlhm"]
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.763804 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ltlhm"]
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.776630 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9"]
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.781158 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrbz9"]
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.813808 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw"]
Feb 21 21:51:39 crc kubenswrapper[4717]: E0221 21:51:39.814819 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86542c4e-7f2b-4933-9d0d-737228524851" containerName="controller-manager"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.814962 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="86542c4e-7f2b-4933-9d0d-737228524851" containerName="controller-manager"
Feb 21 21:51:39 crc kubenswrapper[4717]: E0221 21:51:39.815036 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.815088 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 21 21:51:39 crc kubenswrapper[4717]: E0221 21:51:39.815125 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d105add5-0618-41cf-ae5d-59b739694e4c" containerName="route-controller-manager"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.815175 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d105add5-0618-41cf-ae5d-59b739694e4c" containerName="route-controller-manager"
Feb 21 21:51:39 crc kubenswrapper[4717]: E0221 21:51:39.815263 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" containerName="installer"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.815310 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" containerName="installer"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.815789 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.815856 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d105add5-0618-41cf-ae5d-59b739694e4c" containerName="route-controller-manager"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.815967 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="86542c4e-7f2b-4933-9d0d-737228524851" containerName="controller-manager"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.816005 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a6a817-09f9-4c86-a91a-6d4b695cedd1" containerName="installer"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.817085 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.820930 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.821729 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.822175 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.822527 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.822728 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54c577d67c-rdb2n"]
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.823749 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.824633 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.825964 4717 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.831019 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54c577d67c-rdb2n"] Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.834793 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw"] Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.835066 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.835259 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.835316 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.835444 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.835796 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.850319 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.876803 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.877251 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-config\") pod \"controller-manager-54c577d67c-rdb2n\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.877344 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-client-ca\") pod \"controller-manager-54c577d67c-rdb2n\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.877413 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5cjv\" (UniqueName: \"kubernetes.io/projected/7a2eaf15-eb11-4818-9e0a-888ae2b19730-kube-api-access-l5cjv\") pod \"controller-manager-54c577d67c-rdb2n\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.877442 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a2eaf15-eb11-4818-9e0a-888ae2b19730-serving-cert\") pod \"controller-manager-54c577d67c-rdb2n\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.877473 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-proxy-ca-bundles\") pod \"controller-manager-54c577d67c-rdb2n\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " 
pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.978090 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-client-ca\") pod \"controller-manager-54c577d67c-rdb2n\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.978157 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bffd34c0-8496-470b-b160-c7ea3a69eecd-config\") pod \"route-controller-manager-666f7579cb-x78tw\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.978179 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bffd34c0-8496-470b-b160-c7ea3a69eecd-client-ca\") pod \"route-controller-manager-666f7579cb-x78tw\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.978541 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5cjv\" (UniqueName: \"kubernetes.io/projected/7a2eaf15-eb11-4818-9e0a-888ae2b19730-kube-api-access-l5cjv\") pod \"controller-manager-54c577d67c-rdb2n\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.978595 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7a2eaf15-eb11-4818-9e0a-888ae2b19730-serving-cert\") pod \"controller-manager-54c577d67c-rdb2n\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.978630 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjrtf\" (UniqueName: \"kubernetes.io/projected/bffd34c0-8496-470b-b160-c7ea3a69eecd-kube-api-access-vjrtf\") pod \"route-controller-manager-666f7579cb-x78tw\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.978658 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-proxy-ca-bundles\") pod \"controller-manager-54c577d67c-rdb2n\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.978698 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-config\") pod \"controller-manager-54c577d67c-rdb2n\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.978744 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bffd34c0-8496-470b-b160-c7ea3a69eecd-serving-cert\") pod \"route-controller-manager-666f7579cb-x78tw\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" 
Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.980469 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-proxy-ca-bundles\") pod \"controller-manager-54c577d67c-rdb2n\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.980834 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-client-ca\") pod \"controller-manager-54c577d67c-rdb2n\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.982035 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-config\") pod \"controller-manager-54c577d67c-rdb2n\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.984110 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a2eaf15-eb11-4818-9e0a-888ae2b19730-serving-cert\") pod \"controller-manager-54c577d67c-rdb2n\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.989188 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86542c4e-7f2b-4933-9d0d-737228524851" path="/var/lib/kubelet/pods/86542c4e-7f2b-4933-9d0d-737228524851/volumes" Feb 21 21:51:39 crc kubenswrapper[4717]: I0221 21:51:39.990844 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="d105add5-0618-41cf-ae5d-59b739694e4c" path="/var/lib/kubelet/pods/d105add5-0618-41cf-ae5d-59b739694e4c/volumes" Feb 21 21:51:40 crc kubenswrapper[4717]: I0221 21:51:40.004903 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5cjv\" (UniqueName: \"kubernetes.io/projected/7a2eaf15-eb11-4818-9e0a-888ae2b19730-kube-api-access-l5cjv\") pod \"controller-manager-54c577d67c-rdb2n\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:40 crc kubenswrapper[4717]: I0221 21:51:40.080321 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bffd34c0-8496-470b-b160-c7ea3a69eecd-config\") pod \"route-controller-manager-666f7579cb-x78tw\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" Feb 21 21:51:40 crc kubenswrapper[4717]: I0221 21:51:40.080384 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bffd34c0-8496-470b-b160-c7ea3a69eecd-client-ca\") pod \"route-controller-manager-666f7579cb-x78tw\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" Feb 21 21:51:40 crc kubenswrapper[4717]: I0221 21:51:40.080437 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjrtf\" (UniqueName: \"kubernetes.io/projected/bffd34c0-8496-470b-b160-c7ea3a69eecd-kube-api-access-vjrtf\") pod \"route-controller-manager-666f7579cb-x78tw\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" Feb 21 21:51:40 crc kubenswrapper[4717]: I0221 21:51:40.080504 4717 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bffd34c0-8496-470b-b160-c7ea3a69eecd-serving-cert\") pod \"route-controller-manager-666f7579cb-x78tw\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" Feb 21 21:51:40 crc kubenswrapper[4717]: I0221 21:51:40.083145 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bffd34c0-8496-470b-b160-c7ea3a69eecd-client-ca\") pod \"route-controller-manager-666f7579cb-x78tw\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" Feb 21 21:51:40 crc kubenswrapper[4717]: I0221 21:51:40.084831 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bffd34c0-8496-470b-b160-c7ea3a69eecd-config\") pod \"route-controller-manager-666f7579cb-x78tw\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" Feb 21 21:51:40 crc kubenswrapper[4717]: I0221 21:51:40.088124 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bffd34c0-8496-470b-b160-c7ea3a69eecd-serving-cert\") pod \"route-controller-manager-666f7579cb-x78tw\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" Feb 21 21:51:40 crc kubenswrapper[4717]: I0221 21:51:40.119792 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjrtf\" (UniqueName: \"kubernetes.io/projected/bffd34c0-8496-470b-b160-c7ea3a69eecd-kube-api-access-vjrtf\") pod \"route-controller-manager-666f7579cb-x78tw\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " 
pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" Feb 21 21:51:40 crc kubenswrapper[4717]: I0221 21:51:40.192940 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" Feb 21 21:51:40 crc kubenswrapper[4717]: I0221 21:51:40.202852 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:41 crc kubenswrapper[4717]: I0221 21:51:41.129312 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54c577d67c-rdb2n"] Feb 21 21:51:41 crc kubenswrapper[4717]: I0221 21:51:41.133665 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw"] Feb 21 21:51:41 crc kubenswrapper[4717]: W0221 21:51:41.147417 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbffd34c0_8496_470b_b160_c7ea3a69eecd.slice/crio-95ca222fa9739d50b3c434a596a1de9332c26612ac7b08f0b62ba813ecb1352e WatchSource:0}: Error finding container 95ca222fa9739d50b3c434a596a1de9332c26612ac7b08f0b62ba813ecb1352e: Status 404 returned error can't find the container with id 95ca222fa9739d50b3c434a596a1de9332c26612ac7b08f0b62ba813ecb1352e Feb 21 21:51:41 crc kubenswrapper[4717]: I0221 21:51:41.723998 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" event={"ID":"7a2eaf15-eb11-4818-9e0a-888ae2b19730","Type":"ContainerStarted","Data":"b00b5a03bb3c840369c6d9c97e1d7e226037d758e6cb6b261d1ea79a33f8716d"} Feb 21 21:51:41 crc kubenswrapper[4717]: I0221 21:51:41.724417 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 
21:51:41 crc kubenswrapper[4717]: I0221 21:51:41.724434 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" event={"ID":"7a2eaf15-eb11-4818-9e0a-888ae2b19730","Type":"ContainerStarted","Data":"439b9212855e8ab0fde3855a9237e98b10f7b4fcb1917a39c20eab8bc3ad158e"} Feb 21 21:51:41 crc kubenswrapper[4717]: I0221 21:51:41.725726 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" event={"ID":"bffd34c0-8496-470b-b160-c7ea3a69eecd","Type":"ContainerStarted","Data":"317f93d3863e5304a45d79f985ffc6d3c24d84ef7dbec3f40bd0b0d15c375815"} Feb 21 21:51:41 crc kubenswrapper[4717]: I0221 21:51:41.725760 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" event={"ID":"bffd34c0-8496-470b-b160-c7ea3a69eecd","Type":"ContainerStarted","Data":"95ca222fa9739d50b3c434a596a1de9332c26612ac7b08f0b62ba813ecb1352e"} Feb 21 21:51:41 crc kubenswrapper[4717]: I0221 21:51:41.726007 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" Feb 21 21:51:41 crc kubenswrapper[4717]: I0221 21:51:41.754173 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" podStartSLOduration=3.754131654 podStartE2EDuration="3.754131654s" podCreationTimestamp="2026-02-21 21:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:51:41.752204045 +0000 UTC m=+316.533737667" watchObservedRunningTime="2026-02-21 21:51:41.754131654 +0000 UTC m=+316.535665276" Feb 21 21:51:41 crc kubenswrapper[4717]: I0221 21:51:41.775489 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" podStartSLOduration=3.7754740829999998 podStartE2EDuration="3.775474083s" podCreationTimestamp="2026-02-21 21:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:51:41.774682737 +0000 UTC m=+316.556216359" watchObservedRunningTime="2026-02-21 21:51:41.775474083 +0000 UTC m=+316.557007705" Feb 21 21:51:41 crc kubenswrapper[4717]: I0221 21:51:41.779788 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:41 crc kubenswrapper[4717]: I0221 21:51:41.855991 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" Feb 21 21:51:49 crc kubenswrapper[4717]: I0221 21:51:49.719133 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54c577d67c-rdb2n"] Feb 21 21:51:49 crc kubenswrapper[4717]: I0221 21:51:49.720132 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" podUID="7a2eaf15-eb11-4818-9e0a-888ae2b19730" containerName="controller-manager" containerID="cri-o://b00b5a03bb3c840369c6d9c97e1d7e226037d758e6cb6b261d1ea79a33f8716d" gracePeriod=30 Feb 21 21:51:49 crc kubenswrapper[4717]: I0221 21:51:49.733955 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw"] Feb 21 21:51:49 crc kubenswrapper[4717]: I0221 21:51:49.734257 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" podUID="bffd34c0-8496-470b-b160-c7ea3a69eecd" containerName="route-controller-manager" 
containerID="cri-o://317f93d3863e5304a45d79f985ffc6d3c24d84ef7dbec3f40bd0b0d15c375815" gracePeriod=30 Feb 21 21:51:50 crc kubenswrapper[4717]: I0221 21:51:50.787977 4717 generic.go:334] "Generic (PLEG): container finished" podID="7a2eaf15-eb11-4818-9e0a-888ae2b19730" containerID="b00b5a03bb3c840369c6d9c97e1d7e226037d758e6cb6b261d1ea79a33f8716d" exitCode=0 Feb 21 21:51:50 crc kubenswrapper[4717]: I0221 21:51:50.788035 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" event={"ID":"7a2eaf15-eb11-4818-9e0a-888ae2b19730","Type":"ContainerDied","Data":"b00b5a03bb3c840369c6d9c97e1d7e226037d758e6cb6b261d1ea79a33f8716d"} Feb 21 21:51:50 crc kubenswrapper[4717]: I0221 21:51:50.788652 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" event={"ID":"7a2eaf15-eb11-4818-9e0a-888ae2b19730","Type":"ContainerDied","Data":"439b9212855e8ab0fde3855a9237e98b10f7b4fcb1917a39c20eab8bc3ad158e"} Feb 21 21:51:50 crc kubenswrapper[4717]: I0221 21:51:50.788678 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="439b9212855e8ab0fde3855a9237e98b10f7b4fcb1917a39c20eab8bc3ad158e" Feb 21 21:51:50 crc kubenswrapper[4717]: I0221 21:51:50.791189 4717 generic.go:334] "Generic (PLEG): container finished" podID="bffd34c0-8496-470b-b160-c7ea3a69eecd" containerID="317f93d3863e5304a45d79f985ffc6d3c24d84ef7dbec3f40bd0b0d15c375815" exitCode=0 Feb 21 21:51:50 crc kubenswrapper[4717]: I0221 21:51:50.791251 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" event={"ID":"bffd34c0-8496-470b-b160-c7ea3a69eecd","Type":"ContainerDied","Data":"317f93d3863e5304a45d79f985ffc6d3c24d84ef7dbec3f40bd0b0d15c375815"} Feb 21 21:51:50 crc kubenswrapper[4717]: I0221 21:51:50.791291 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" event={"ID":"bffd34c0-8496-470b-b160-c7ea3a69eecd","Type":"ContainerDied","Data":"95ca222fa9739d50b3c434a596a1de9332c26612ac7b08f0b62ba813ecb1352e"} Feb 21 21:51:50 crc kubenswrapper[4717]: I0221 21:51:50.791314 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95ca222fa9739d50b3c434a596a1de9332c26612ac7b08f0b62ba813ecb1352e" Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.007449 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.028315 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.045498 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bffd34c0-8496-470b-b160-c7ea3a69eecd-client-ca\") pod \"bffd34c0-8496-470b-b160-c7ea3a69eecd\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.045550 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-config\") pod \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.045570 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bffd34c0-8496-470b-b160-c7ea3a69eecd-config\") pod \"bffd34c0-8496-470b-b160-c7ea3a69eecd\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.045605 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjrtf\" (UniqueName: \"kubernetes.io/projected/bffd34c0-8496-470b-b160-c7ea3a69eecd-kube-api-access-vjrtf\") pod \"bffd34c0-8496-470b-b160-c7ea3a69eecd\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.045628 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a2eaf15-eb11-4818-9e0a-888ae2b19730-serving-cert\") pod \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.045651 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-proxy-ca-bundles\") pod \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.045665 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bffd34c0-8496-470b-b160-c7ea3a69eecd-serving-cert\") pod \"bffd34c0-8496-470b-b160-c7ea3a69eecd\" (UID: \"bffd34c0-8496-470b-b160-c7ea3a69eecd\") " Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.045682 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5cjv\" (UniqueName: \"kubernetes.io/projected/7a2eaf15-eb11-4818-9e0a-888ae2b19730-kube-api-access-l5cjv\") pod \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") " Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.045706 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-client-ca\") pod 
\"7a2eaf15-eb11-4818-9e0a-888ae2b19730\" (UID: \"7a2eaf15-eb11-4818-9e0a-888ae2b19730\") "
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.047161 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7a2eaf15-eb11-4818-9e0a-888ae2b19730" (UID: "7a2eaf15-eb11-4818-9e0a-888ae2b19730"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.047608 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bffd34c0-8496-470b-b160-c7ea3a69eecd-client-ca" (OuterVolumeSpecName: "client-ca") pod "bffd34c0-8496-470b-b160-c7ea3a69eecd" (UID: "bffd34c0-8496-470b-b160-c7ea3a69eecd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.050980 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-config" (OuterVolumeSpecName: "config") pod "7a2eaf15-eb11-4818-9e0a-888ae2b19730" (UID: "7a2eaf15-eb11-4818-9e0a-888ae2b19730"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.055423 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a2eaf15-eb11-4818-9e0a-888ae2b19730-kube-api-access-l5cjv" (OuterVolumeSpecName: "kube-api-access-l5cjv") pod "7a2eaf15-eb11-4818-9e0a-888ae2b19730" (UID: "7a2eaf15-eb11-4818-9e0a-888ae2b19730"). InnerVolumeSpecName "kube-api-access-l5cjv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.055987 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bffd34c0-8496-470b-b160-c7ea3a69eecd-kube-api-access-vjrtf" (OuterVolumeSpecName: "kube-api-access-vjrtf") pod "bffd34c0-8496-470b-b160-c7ea3a69eecd" (UID: "bffd34c0-8496-470b-b160-c7ea3a69eecd"). InnerVolumeSpecName "kube-api-access-vjrtf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.057041 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a2eaf15-eb11-4818-9e0a-888ae2b19730-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7a2eaf15-eb11-4818-9e0a-888ae2b19730" (UID: "7a2eaf15-eb11-4818-9e0a-888ae2b19730"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.057566 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-client-ca" (OuterVolumeSpecName: "client-ca") pod "7a2eaf15-eb11-4818-9e0a-888ae2b19730" (UID: "7a2eaf15-eb11-4818-9e0a-888ae2b19730"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.061452 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bffd34c0-8496-470b-b160-c7ea3a69eecd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bffd34c0-8496-470b-b160-c7ea3a69eecd" (UID: "bffd34c0-8496-470b-b160-c7ea3a69eecd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.062160 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bffd34c0-8496-470b-b160-c7ea3a69eecd-config" (OuterVolumeSpecName: "config") pod "bffd34c0-8496-470b-b160-c7ea3a69eecd" (UID: "bffd34c0-8496-470b-b160-c7ea3a69eecd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.066207 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"]
Feb 21 21:51:51 crc kubenswrapper[4717]: E0221 21:51:51.066446 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2eaf15-eb11-4818-9e0a-888ae2b19730" containerName="controller-manager"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.066465 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2eaf15-eb11-4818-9e0a-888ae2b19730" containerName="controller-manager"
Feb 21 21:51:51 crc kubenswrapper[4717]: E0221 21:51:51.066474 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bffd34c0-8496-470b-b160-c7ea3a69eecd" containerName="route-controller-manager"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.066480 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="bffd34c0-8496-470b-b160-c7ea3a69eecd" containerName="route-controller-manager"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.066629 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2eaf15-eb11-4818-9e0a-888ae2b19730" containerName="controller-manager"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.066648 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="bffd34c0-8496-470b-b160-c7ea3a69eecd" containerName="route-controller-manager"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.067026 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.069342 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"]
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.147140 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16f177a7-ed6e-43d7-b569-8143d1fcfe7a-client-ca\") pod \"route-controller-manager-756dcd98c-sd6wp\" (UID: \"16f177a7-ed6e-43d7-b569-8143d1fcfe7a\") " pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.147208 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16f177a7-ed6e-43d7-b569-8143d1fcfe7a-serving-cert\") pod \"route-controller-manager-756dcd98c-sd6wp\" (UID: \"16f177a7-ed6e-43d7-b569-8143d1fcfe7a\") " pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.147245 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16f177a7-ed6e-43d7-b569-8143d1fcfe7a-config\") pod \"route-controller-manager-756dcd98c-sd6wp\" (UID: \"16f177a7-ed6e-43d7-b569-8143d1fcfe7a\") " pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.147284 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgq2j\" (UniqueName: \"kubernetes.io/projected/16f177a7-ed6e-43d7-b569-8143d1fcfe7a-kube-api-access-dgq2j\") pod \"route-controller-manager-756dcd98c-sd6wp\" (UID: \"16f177a7-ed6e-43d7-b569-8143d1fcfe7a\") " pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.147336 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjrtf\" (UniqueName: \"kubernetes.io/projected/bffd34c0-8496-470b-b160-c7ea3a69eecd-kube-api-access-vjrtf\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.147364 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a2eaf15-eb11-4818-9e0a-888ae2b19730-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.147374 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.147383 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bffd34c0-8496-470b-b160-c7ea3a69eecd-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.147391 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5cjv\" (UniqueName: \"kubernetes.io/projected/7a2eaf15-eb11-4818-9e0a-888ae2b19730-kube-api-access-l5cjv\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.147401 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-client-ca\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.147410 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bffd34c0-8496-470b-b160-c7ea3a69eecd-client-ca\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.147417 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a2eaf15-eb11-4818-9e0a-888ae2b19730-config\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.147425 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bffd34c0-8496-470b-b160-c7ea3a69eecd-config\") on node \"crc\" DevicePath \"\""
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.194756 4717 patch_prober.go:28] interesting pod/route-controller-manager-666f7579cb-x78tw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.195072 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw" podUID="bffd34c0-8496-470b-b160-c7ea3a69eecd" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.203981 4717 patch_prober.go:28] interesting pod/controller-manager-54c577d67c-rdb2n container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.204043 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n" podUID="7a2eaf15-eb11-4818-9e0a-888ae2b19730" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.248262 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16f177a7-ed6e-43d7-b569-8143d1fcfe7a-client-ca\") pod \"route-controller-manager-756dcd98c-sd6wp\" (UID: \"16f177a7-ed6e-43d7-b569-8143d1fcfe7a\") " pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.248333 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16f177a7-ed6e-43d7-b569-8143d1fcfe7a-serving-cert\") pod \"route-controller-manager-756dcd98c-sd6wp\" (UID: \"16f177a7-ed6e-43d7-b569-8143d1fcfe7a\") " pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.248354 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16f177a7-ed6e-43d7-b569-8143d1fcfe7a-config\") pod \"route-controller-manager-756dcd98c-sd6wp\" (UID: \"16f177a7-ed6e-43d7-b569-8143d1fcfe7a\") " pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.248376 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgq2j\" (UniqueName: \"kubernetes.io/projected/16f177a7-ed6e-43d7-b569-8143d1fcfe7a-kube-api-access-dgq2j\") pod \"route-controller-manager-756dcd98c-sd6wp\" (UID: \"16f177a7-ed6e-43d7-b569-8143d1fcfe7a\") " pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.250058 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16f177a7-ed6e-43d7-b569-8143d1fcfe7a-client-ca\") pod \"route-controller-manager-756dcd98c-sd6wp\" (UID: \"16f177a7-ed6e-43d7-b569-8143d1fcfe7a\") " pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.250519 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16f177a7-ed6e-43d7-b569-8143d1fcfe7a-config\") pod \"route-controller-manager-756dcd98c-sd6wp\" (UID: \"16f177a7-ed6e-43d7-b569-8143d1fcfe7a\") " pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.252826 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16f177a7-ed6e-43d7-b569-8143d1fcfe7a-serving-cert\") pod \"route-controller-manager-756dcd98c-sd6wp\" (UID: \"16f177a7-ed6e-43d7-b569-8143d1fcfe7a\") " pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.267403 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgq2j\" (UniqueName: \"kubernetes.io/projected/16f177a7-ed6e-43d7-b569-8143d1fcfe7a-kube-api-access-dgq2j\") pod \"route-controller-manager-756dcd98c-sd6wp\" (UID: \"16f177a7-ed6e-43d7-b569-8143d1fcfe7a\") " pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.397422 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.775963 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"]
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.797124 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp" event={"ID":"16f177a7-ed6e-43d7-b569-8143d1fcfe7a","Type":"ContainerStarted","Data":"1ef03f233fb12ab7ac29c687e9e3544453fa1482c3c84dbbb3e858711cbc2c3a"}
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.797173 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54c577d67c-rdb2n"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.797239 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.837429 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54c577d67c-rdb2n"]
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.850553 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-54c577d67c-rdb2n"]
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.855228 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw"]
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.858161 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-666f7579cb-x78tw"]
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.984414 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a2eaf15-eb11-4818-9e0a-888ae2b19730" path="/var/lib/kubelet/pods/7a2eaf15-eb11-4818-9e0a-888ae2b19730/volumes"
Feb 21 21:51:51 crc kubenswrapper[4717]: I0221 21:51:51.985356 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bffd34c0-8496-470b-b160-c7ea3a69eecd" path="/var/lib/kubelet/pods/bffd34c0-8496-470b-b160-c7ea3a69eecd/volumes"
Feb 21 21:51:52 crc kubenswrapper[4717]: I0221 21:51:52.803394 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp" event={"ID":"16f177a7-ed6e-43d7-b569-8143d1fcfe7a","Type":"ContainerStarted","Data":"757b601511c4b67b8cdec05e35a875d584070941b3ec1d50a526eabae91ff294"}
Feb 21 21:51:52 crc kubenswrapper[4717]: I0221 21:51:52.803986 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:52 crc kubenswrapper[4717]: I0221 21:51:52.811429 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp"
Feb 21 21:51:52 crc kubenswrapper[4717]: I0221 21:51:52.829392 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-756dcd98c-sd6wp" podStartSLOduration=3.8293598319999997 podStartE2EDuration="3.829359832s" podCreationTimestamp="2026-02-21 21:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:51:52.822615034 +0000 UTC m=+327.604148656" watchObservedRunningTime="2026-02-21 21:51:52.829359832 +0000 UTC m=+327.610893464"
Feb 21 21:51:53 crc kubenswrapper[4717]: I0221 21:51:53.820425 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c87769648-pjj7c"]
Feb 21 21:51:53 crc kubenswrapper[4717]: I0221 21:51:53.821673 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:53 crc kubenswrapper[4717]: I0221 21:51:53.827044 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c87769648-pjj7c"]
Feb 21 21:51:53 crc kubenswrapper[4717]: I0221 21:51:53.833488 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 21 21:51:53 crc kubenswrapper[4717]: I0221 21:51:53.833719 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 21 21:51:53 crc kubenswrapper[4717]: I0221 21:51:53.833997 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 21 21:51:53 crc kubenswrapper[4717]: I0221 21:51:53.834196 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 21 21:51:53 crc kubenswrapper[4717]: I0221 21:51:53.834769 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 21 21:51:53 crc kubenswrapper[4717]: I0221 21:51:53.841650 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 21 21:51:53 crc kubenswrapper[4717]: I0221 21:51:53.847331 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 21 21:51:53 crc kubenswrapper[4717]: I0221 21:51:53.980923 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9891c1e-cac4-4118-8362-357bede455b3-serving-cert\") pod \"controller-manager-5c87769648-pjj7c\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") " pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:53 crc kubenswrapper[4717]: I0221 21:51:53.981110 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-proxy-ca-bundles\") pod \"controller-manager-5c87769648-pjj7c\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") " pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:53 crc kubenswrapper[4717]: I0221 21:51:53.981173 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-config\") pod \"controller-manager-5c87769648-pjj7c\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") " pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:53 crc kubenswrapper[4717]: I0221 21:51:53.981222 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-client-ca\") pod \"controller-manager-5c87769648-pjj7c\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") " pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:53 crc kubenswrapper[4717]: I0221 21:51:53.981275 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvmjm\" (UniqueName: \"kubernetes.io/projected/b9891c1e-cac4-4118-8362-357bede455b3-kube-api-access-wvmjm\") pod \"controller-manager-5c87769648-pjj7c\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") " pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:54 crc kubenswrapper[4717]: I0221 21:51:54.082461 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9891c1e-cac4-4118-8362-357bede455b3-serving-cert\") pod \"controller-manager-5c87769648-pjj7c\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") " pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:54 crc kubenswrapper[4717]: I0221 21:51:54.082578 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-proxy-ca-bundles\") pod \"controller-manager-5c87769648-pjj7c\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") " pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:54 crc kubenswrapper[4717]: I0221 21:51:54.082629 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-config\") pod \"controller-manager-5c87769648-pjj7c\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") " pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:54 crc kubenswrapper[4717]: I0221 21:51:54.082665 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-client-ca\") pod \"controller-manager-5c87769648-pjj7c\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") " pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:54 crc kubenswrapper[4717]: I0221 21:51:54.082716 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvmjm\" (UniqueName: \"kubernetes.io/projected/b9891c1e-cac4-4118-8362-357bede455b3-kube-api-access-wvmjm\") pod \"controller-manager-5c87769648-pjj7c\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") " pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:54 crc kubenswrapper[4717]: I0221 21:51:54.085339 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-proxy-ca-bundles\") pod \"controller-manager-5c87769648-pjj7c\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") " pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:54 crc kubenswrapper[4717]: I0221 21:51:54.085380 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-client-ca\") pod \"controller-manager-5c87769648-pjj7c\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") " pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:54 crc kubenswrapper[4717]: I0221 21:51:54.086642 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-config\") pod \"controller-manager-5c87769648-pjj7c\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") " pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:54 crc kubenswrapper[4717]: I0221 21:51:54.095466 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9891c1e-cac4-4118-8362-357bede455b3-serving-cert\") pod \"controller-manager-5c87769648-pjj7c\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") " pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:54 crc kubenswrapper[4717]: I0221 21:51:54.118427 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvmjm\" (UniqueName: \"kubernetes.io/projected/b9891c1e-cac4-4118-8362-357bede455b3-kube-api-access-wvmjm\") pod \"controller-manager-5c87769648-pjj7c\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") " pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:54 crc kubenswrapper[4717]: I0221 21:51:54.162228 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:54 crc kubenswrapper[4717]: I0221 21:51:54.643953 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c87769648-pjj7c"]
Feb 21 21:51:54 crc kubenswrapper[4717]: W0221 21:51:54.646780 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9891c1e_cac4_4118_8362_357bede455b3.slice/crio-e9a07347ac49e4119d61a886783d51f974cfb0f6565c92e0c8ff775c7dea1ee9 WatchSource:0}: Error finding container e9a07347ac49e4119d61a886783d51f974cfb0f6565c92e0c8ff775c7dea1ee9: Status 404 returned error can't find the container with id e9a07347ac49e4119d61a886783d51f974cfb0f6565c92e0c8ff775c7dea1ee9
Feb 21 21:51:54 crc kubenswrapper[4717]: I0221 21:51:54.816927 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c" event={"ID":"b9891c1e-cac4-4118-8362-357bede455b3","Type":"ContainerStarted","Data":"ec7b8547c05d70ac85dad4c26df47ddd8d69bccf65809432983beba4789d326f"}
Feb 21 21:51:54 crc kubenswrapper[4717]: I0221 21:51:54.817019 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c" event={"ID":"b9891c1e-cac4-4118-8362-357bede455b3","Type":"ContainerStarted","Data":"e9a07347ac49e4119d61a886783d51f974cfb0f6565c92e0c8ff775c7dea1ee9"}
Feb 21 21:51:54 crc kubenswrapper[4717]: I0221 21:51:54.839337 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c" podStartSLOduration=5.839306581 podStartE2EDuration="5.839306581s" podCreationTimestamp="2026-02-21 21:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:51:54.83347922 +0000 UTC m=+329.615012852" watchObservedRunningTime="2026-02-21 21:51:54.839306581 +0000 UTC m=+329.620840233"
Feb 21 21:51:55 crc kubenswrapper[4717]: I0221 21:51:55.822473 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:51:55 crc kubenswrapper[4717]: I0221 21:51:55.827156 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:52:09 crc kubenswrapper[4717]: I0221 21:52:09.063129 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 21:52:09 crc kubenswrapper[4717]: I0221 21:52:09.063774 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.337776 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c87769648-pjj7c"]
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.338630 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c" podUID="b9891c1e-cac4-4118-8362-357bede455b3" containerName="controller-manager" containerID="cri-o://ec7b8547c05d70ac85dad4c26df47ddd8d69bccf65809432983beba4789d326f" gracePeriod=30
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.954989 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.973982 4717 generic.go:334] "Generic (PLEG): container finished" podID="b9891c1e-cac4-4118-8362-357bede455b3" containerID="ec7b8547c05d70ac85dad4c26df47ddd8d69bccf65809432983beba4789d326f" exitCode=0
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.974069 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c" event={"ID":"b9891c1e-cac4-4118-8362-357bede455b3","Type":"ContainerDied","Data":"ec7b8547c05d70ac85dad4c26df47ddd8d69bccf65809432983beba4789d326f"}
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.974121 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c" event={"ID":"b9891c1e-cac4-4118-8362-357bede455b3","Type":"ContainerDied","Data":"e9a07347ac49e4119d61a886783d51f974cfb0f6565c92e0c8ff775c7dea1ee9"}
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.974160 4717 scope.go:117] "RemoveContainer" containerID="ec7b8547c05d70ac85dad4c26df47ddd8d69bccf65809432983beba4789d326f"
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.974334 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c87769648-pjj7c"
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.985789 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-proxy-ca-bundles\") pod \"b9891c1e-cac4-4118-8362-357bede455b3\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") "
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.985853 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9891c1e-cac4-4118-8362-357bede455b3-serving-cert\") pod \"b9891c1e-cac4-4118-8362-357bede455b3\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") "
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.985968 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-client-ca\") pod \"b9891c1e-cac4-4118-8362-357bede455b3\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") "
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.986012 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-config\") pod \"b9891c1e-cac4-4118-8362-357bede455b3\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") "
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.986061 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvmjm\" (UniqueName: \"kubernetes.io/projected/b9891c1e-cac4-4118-8362-357bede455b3-kube-api-access-wvmjm\") pod \"b9891c1e-cac4-4118-8362-357bede455b3\" (UID: \"b9891c1e-cac4-4118-8362-357bede455b3\") "
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.987129 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b9891c1e-cac4-4118-8362-357bede455b3" (UID: "b9891c1e-cac4-4118-8362-357bede455b3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.988600 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-client-ca" (OuterVolumeSpecName: "client-ca") pod "b9891c1e-cac4-4118-8362-357bede455b3" (UID: "b9891c1e-cac4-4118-8362-357bede455b3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.988832 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-config" (OuterVolumeSpecName: "config") pod "b9891c1e-cac4-4118-8362-357bede455b3" (UID: "b9891c1e-cac4-4118-8362-357bede455b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.993397 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9891c1e-cac4-4118-8362-357bede455b3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b9891c1e-cac4-4118-8362-357bede455b3" (UID: "b9891c1e-cac4-4118-8362-357bede455b3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 21:52:18 crc kubenswrapper[4717]: I0221 21:52:18.995602 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9891c1e-cac4-4118-8362-357bede455b3-kube-api-access-wvmjm" (OuterVolumeSpecName: "kube-api-access-wvmjm") pod "b9891c1e-cac4-4118-8362-357bede455b3" (UID: "b9891c1e-cac4-4118-8362-357bede455b3"). InnerVolumeSpecName "kube-api-access-wvmjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.010861 4717 scope.go:117] "RemoveContainer" containerID="ec7b8547c05d70ac85dad4c26df47ddd8d69bccf65809432983beba4789d326f"
Feb 21 21:52:19 crc kubenswrapper[4717]: E0221 21:52:19.011744 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec7b8547c05d70ac85dad4c26df47ddd8d69bccf65809432983beba4789d326f\": container with ID starting with ec7b8547c05d70ac85dad4c26df47ddd8d69bccf65809432983beba4789d326f not found: ID does not exist" containerID="ec7b8547c05d70ac85dad4c26df47ddd8d69bccf65809432983beba4789d326f"
Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.011796 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec7b8547c05d70ac85dad4c26df47ddd8d69bccf65809432983beba4789d326f"} err="failed to get container status \"ec7b8547c05d70ac85dad4c26df47ddd8d69bccf65809432983beba4789d326f\": rpc error: code = NotFound desc = could not find container \"ec7b8547c05d70ac85dad4c26df47ddd8d69bccf65809432983beba4789d326f\": container with ID starting with ec7b8547c05d70ac85dad4c26df47ddd8d69bccf65809432983beba4789d326f not found: ID does not exist"
Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.087626 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.087659 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9891c1e-cac4-4118-8362-357bede455b3-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.087670 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-client-ca\") on node \"crc\" DevicePath \"\""
Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.087681 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9891c1e-cac4-4118-8362-357bede455b3-config\") on node \"crc\" DevicePath \"\""
Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.087693 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvmjm\" (UniqueName: \"kubernetes.io/projected/b9891c1e-cac4-4118-8362-357bede455b3-kube-api-access-wvmjm\") on node \"crc\" DevicePath \"\""
Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.318215 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c87769648-pjj7c"]
Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.326952 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5c87769648-pjj7c"]
Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.841684 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-cc797fb79-lkmf6"]
Feb 21 21:52:19 crc kubenswrapper[4717]: E0221 21:52:19.842036 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9891c1e-cac4-4118-8362-357bede455b3" containerName="controller-manager"
Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.842057 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9891c1e-cac4-4118-8362-357bede455b3" containerName="controller-manager"
Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.842216 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9891c1e-cac4-4118-8362-357bede455b3" containerName="controller-manager"
Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.842771 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.847669 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.847941 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.848365 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.849174 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.853149 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.853161 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.859174 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.868186 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cc797fb79-lkmf6"] Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.899669 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/956c1b4b-70e5-41af-b0b4-503404e425a8-client-ca\") pod \"controller-manager-cc797fb79-lkmf6\" (UID: \"956c1b4b-70e5-41af-b0b4-503404e425a8\") " 
pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.899771 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/956c1b4b-70e5-41af-b0b4-503404e425a8-serving-cert\") pod \"controller-manager-cc797fb79-lkmf6\" (UID: \"956c1b4b-70e5-41af-b0b4-503404e425a8\") " pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.899959 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwmnp\" (UniqueName: \"kubernetes.io/projected/956c1b4b-70e5-41af-b0b4-503404e425a8-kube-api-access-lwmnp\") pod \"controller-manager-cc797fb79-lkmf6\" (UID: \"956c1b4b-70e5-41af-b0b4-503404e425a8\") " pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.900002 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/956c1b4b-70e5-41af-b0b4-503404e425a8-proxy-ca-bundles\") pod \"controller-manager-cc797fb79-lkmf6\" (UID: \"956c1b4b-70e5-41af-b0b4-503404e425a8\") " pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.900031 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/956c1b4b-70e5-41af-b0b4-503404e425a8-config\") pod \"controller-manager-cc797fb79-lkmf6\" (UID: \"956c1b4b-70e5-41af-b0b4-503404e425a8\") " pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:19 crc kubenswrapper[4717]: I0221 21:52:19.988600 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9891c1e-cac4-4118-8362-357bede455b3" 
path="/var/lib/kubelet/pods/b9891c1e-cac4-4118-8362-357bede455b3/volumes" Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.001272 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/956c1b4b-70e5-41af-b0b4-503404e425a8-client-ca\") pod \"controller-manager-cc797fb79-lkmf6\" (UID: \"956c1b4b-70e5-41af-b0b4-503404e425a8\") " pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.001389 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/956c1b4b-70e5-41af-b0b4-503404e425a8-serving-cert\") pod \"controller-manager-cc797fb79-lkmf6\" (UID: \"956c1b4b-70e5-41af-b0b4-503404e425a8\") " pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.001508 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwmnp\" (UniqueName: \"kubernetes.io/projected/956c1b4b-70e5-41af-b0b4-503404e425a8-kube-api-access-lwmnp\") pod \"controller-manager-cc797fb79-lkmf6\" (UID: \"956c1b4b-70e5-41af-b0b4-503404e425a8\") " pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.001563 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/956c1b4b-70e5-41af-b0b4-503404e425a8-proxy-ca-bundles\") pod \"controller-manager-cc797fb79-lkmf6\" (UID: \"956c1b4b-70e5-41af-b0b4-503404e425a8\") " pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.001610 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/956c1b4b-70e5-41af-b0b4-503404e425a8-config\") pod 
\"controller-manager-cc797fb79-lkmf6\" (UID: \"956c1b4b-70e5-41af-b0b4-503404e425a8\") " pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.003149 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/956c1b4b-70e5-41af-b0b4-503404e425a8-client-ca\") pod \"controller-manager-cc797fb79-lkmf6\" (UID: \"956c1b4b-70e5-41af-b0b4-503404e425a8\") " pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.003315 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/956c1b4b-70e5-41af-b0b4-503404e425a8-proxy-ca-bundles\") pod \"controller-manager-cc797fb79-lkmf6\" (UID: \"956c1b4b-70e5-41af-b0b4-503404e425a8\") " pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.004187 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/956c1b4b-70e5-41af-b0b4-503404e425a8-config\") pod \"controller-manager-cc797fb79-lkmf6\" (UID: \"956c1b4b-70e5-41af-b0b4-503404e425a8\") " pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.007471 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/956c1b4b-70e5-41af-b0b4-503404e425a8-serving-cert\") pod \"controller-manager-cc797fb79-lkmf6\" (UID: \"956c1b4b-70e5-41af-b0b4-503404e425a8\") " pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.026186 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwmnp\" (UniqueName: 
\"kubernetes.io/projected/956c1b4b-70e5-41af-b0b4-503404e425a8-kube-api-access-lwmnp\") pod \"controller-manager-cc797fb79-lkmf6\" (UID: \"956c1b4b-70e5-41af-b0b4-503404e425a8\") " pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.188278 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.660984 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cc797fb79-lkmf6"] Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.989890 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" event={"ID":"956c1b4b-70e5-41af-b0b4-503404e425a8","Type":"ContainerStarted","Data":"aad2d25be9ab37973e45700f764899f6e9c9b14bc380aa1f1b516099fdb5183b"} Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.990227 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.990237 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" event={"ID":"956c1b4b-70e5-41af-b0b4-503404e425a8","Type":"ContainerStarted","Data":"0af7c1c7e49e6ad940592029c6a5ed7a07e3c7ee6cf833d4933428c70c4849d5"} Feb 21 21:52:20 crc kubenswrapper[4717]: I0221 21:52:20.997992 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" Feb 21 21:52:21 crc kubenswrapper[4717]: I0221 21:52:21.030373 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-cc797fb79-lkmf6" podStartSLOduration=3.030346042 
podStartE2EDuration="3.030346042s" podCreationTimestamp="2026-02-21 21:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:52:21.0231736 +0000 UTC m=+355.804707262" watchObservedRunningTime="2026-02-21 21:52:21.030346042 +0000 UTC m=+355.811879694" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.412376 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-v24bc"] Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.413477 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.423950 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-v24bc"] Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.583345 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/000cbac1-418f-401c-8b2c-d54e702657a3-bound-sa-token\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.583460 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/000cbac1-418f-401c-8b2c-d54e702657a3-trusted-ca\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.583575 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/000cbac1-418f-401c-8b2c-d54e702657a3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.583641 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sw77\" (UniqueName: \"kubernetes.io/projected/000cbac1-418f-401c-8b2c-d54e702657a3-kube-api-access-6sw77\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.583676 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/000cbac1-418f-401c-8b2c-d54e702657a3-registry-certificates\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.583774 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.583839 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/000cbac1-418f-401c-8b2c-d54e702657a3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.583896 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/000cbac1-418f-401c-8b2c-d54e702657a3-registry-tls\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.616967 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.698513 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/000cbac1-418f-401c-8b2c-d54e702657a3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.698686 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/000cbac1-418f-401c-8b2c-d54e702657a3-registry-tls\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.698945 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/000cbac1-418f-401c-8b2c-d54e702657a3-bound-sa-token\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.700110 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/000cbac1-418f-401c-8b2c-d54e702657a3-trusted-ca\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.700143 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/000cbac1-418f-401c-8b2c-d54e702657a3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.700172 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sw77\" (UniqueName: \"kubernetes.io/projected/000cbac1-418f-401c-8b2c-d54e702657a3-kube-api-access-6sw77\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.699903 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/000cbac1-418f-401c-8b2c-d54e702657a3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.701191 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/000cbac1-418f-401c-8b2c-d54e702657a3-registry-certificates\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.701271 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/000cbac1-418f-401c-8b2c-d54e702657a3-trusted-ca\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.702933 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/000cbac1-418f-401c-8b2c-d54e702657a3-registry-certificates\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.705949 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/000cbac1-418f-401c-8b2c-d54e702657a3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.706317 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/000cbac1-418f-401c-8b2c-d54e702657a3-registry-tls\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: 
I0221 21:52:24.723559 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/000cbac1-418f-401c-8b2c-d54e702657a3-bound-sa-token\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.724007 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sw77\" (UniqueName: \"kubernetes.io/projected/000cbac1-418f-401c-8b2c-d54e702657a3-kube-api-access-6sw77\") pod \"image-registry-66df7c8f76-v24bc\" (UID: \"000cbac1-418f-401c-8b2c-d54e702657a3\") " pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:24 crc kubenswrapper[4717]: I0221 21:52:24.805568 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:25 crc kubenswrapper[4717]: I0221 21:52:25.249515 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-v24bc"] Feb 21 21:52:25 crc kubenswrapper[4717]: W0221 21:52:25.253469 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod000cbac1_418f_401c_8b2c_d54e702657a3.slice/crio-20632b7cdc7d76bd9bffe6f34028aaad79691f201704bf0efa9bcab09bf8277e WatchSource:0}: Error finding container 20632b7cdc7d76bd9bffe6f34028aaad79691f201704bf0efa9bcab09bf8277e: Status 404 returned error can't find the container with id 20632b7cdc7d76bd9bffe6f34028aaad79691f201704bf0efa9bcab09bf8277e Feb 21 21:52:26 crc kubenswrapper[4717]: I0221 21:52:26.025432 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" 
event={"ID":"000cbac1-418f-401c-8b2c-d54e702657a3","Type":"ContainerStarted","Data":"2416dd704834efa525dd259a6a558103bfabffc0c6c1951711c234915e9e243b"} Feb 21 21:52:26 crc kubenswrapper[4717]: I0221 21:52:26.025797 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" event={"ID":"000cbac1-418f-401c-8b2c-d54e702657a3","Type":"ContainerStarted","Data":"20632b7cdc7d76bd9bffe6f34028aaad79691f201704bf0efa9bcab09bf8277e"} Feb 21 21:52:26 crc kubenswrapper[4717]: I0221 21:52:26.027026 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:39 crc kubenswrapper[4717]: I0221 21:52:39.063306 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 21:52:39 crc kubenswrapper[4717]: I0221 21:52:39.063956 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 21:52:44 crc kubenswrapper[4717]: I0221 21:52:44.814434 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" Feb 21 21:52:44 crc kubenswrapper[4717]: I0221 21:52:44.848203 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-v24bc" podStartSLOduration=20.848174699 podStartE2EDuration="20.848174699s" podCreationTimestamp="2026-02-21 21:52:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:52:26.058361733 +0000 UTC m=+360.839895365" watchObservedRunningTime="2026-02-21 21:52:44.848174699 +0000 UTC m=+379.629708351" Feb 21 21:52:44 crc kubenswrapper[4717]: I0221 21:52:44.898069 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wqr6g"] Feb 21 21:53:09 crc kubenswrapper[4717]: I0221 21:53:09.063318 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 21:53:09 crc kubenswrapper[4717]: I0221 21:53:09.064126 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 21:53:09 crc kubenswrapper[4717]: I0221 21:53:09.064192 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:53:09 crc kubenswrapper[4717]: I0221 21:53:09.065237 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f1592279668470423b045cbb5b5e5ff0c27879fab6c9b2573e402c21a013af59"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 21:53:09 crc kubenswrapper[4717]: I0221 21:53:09.065326 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-flt22" 
podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://f1592279668470423b045cbb5b5e5ff0c27879fab6c9b2573e402c21a013af59" gracePeriod=600 Feb 21 21:53:09 crc kubenswrapper[4717]: I0221 21:53:09.319685 4717 generic.go:334] "Generic (PLEG): container finished" podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerID="f1592279668470423b045cbb5b5e5ff0c27879fab6c9b2573e402c21a013af59" exitCode=0 Feb 21 21:53:09 crc kubenswrapper[4717]: I0221 21:53:09.319830 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"f1592279668470423b045cbb5b5e5ff0c27879fab6c9b2573e402c21a013af59"} Feb 21 21:53:09 crc kubenswrapper[4717]: I0221 21:53:09.320512 4717 scope.go:117] "RemoveContainer" containerID="d4084ad77fbd48700228a07cf9b368bc2a8fb4f3b65222c7b31d74958eb4425b" Feb 21 21:53:09 crc kubenswrapper[4717]: I0221 21:53:09.962690 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" podUID="72ebb725-29ae-4902-9b6b-6258039bb6c0" containerName="registry" containerID="cri-o://38d3ac1f0a3cd7335d2fc7b1e79f0b101179bcc8a2bfb416f9928edccbac5621" gracePeriod=30 Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.329210 4717 generic.go:334] "Generic (PLEG): container finished" podID="72ebb725-29ae-4902-9b6b-6258039bb6c0" containerID="38d3ac1f0a3cd7335d2fc7b1e79f0b101179bcc8a2bfb416f9928edccbac5621" exitCode=0 Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.329320 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" event={"ID":"72ebb725-29ae-4902-9b6b-6258039bb6c0","Type":"ContainerDied","Data":"38d3ac1f0a3cd7335d2fc7b1e79f0b101179bcc8a2bfb416f9928edccbac5621"} Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.332382 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"590a8ade18d9099df0dbb922a2c22739aef34b874d1adc46c6c79c7dc49ef4a7"} Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.475584 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.555678 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72ebb725-29ae-4902-9b6b-6258039bb6c0-trusted-ca\") pod \"72ebb725-29ae-4902-9b6b-6258039bb6c0\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.555765 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72ebb725-29ae-4902-9b6b-6258039bb6c0-ca-trust-extracted\") pod \"72ebb725-29ae-4902-9b6b-6258039bb6c0\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.555796 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-registry-tls\") pod \"72ebb725-29ae-4902-9b6b-6258039bb6c0\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.555819 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72ebb725-29ae-4902-9b6b-6258039bb6c0-registry-certificates\") pod \"72ebb725-29ae-4902-9b6b-6258039bb6c0\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.555854 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72ebb725-29ae-4902-9b6b-6258039bb6c0-installation-pull-secrets\") pod \"72ebb725-29ae-4902-9b6b-6258039bb6c0\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.556035 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"72ebb725-29ae-4902-9b6b-6258039bb6c0\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.556060 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jllh4\" (UniqueName: \"kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-kube-api-access-jllh4\") pod \"72ebb725-29ae-4902-9b6b-6258039bb6c0\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.556082 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-bound-sa-token\") pod \"72ebb725-29ae-4902-9b6b-6258039bb6c0\" (UID: \"72ebb725-29ae-4902-9b6b-6258039bb6c0\") " Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.557757 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72ebb725-29ae-4902-9b6b-6258039bb6c0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "72ebb725-29ae-4902-9b6b-6258039bb6c0" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.558031 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72ebb725-29ae-4902-9b6b-6258039bb6c0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "72ebb725-29ae-4902-9b6b-6258039bb6c0" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.565570 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-kube-api-access-jllh4" (OuterVolumeSpecName: "kube-api-access-jllh4") pod "72ebb725-29ae-4902-9b6b-6258039bb6c0" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0"). InnerVolumeSpecName "kube-api-access-jllh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.566973 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "72ebb725-29ae-4902-9b6b-6258039bb6c0" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.567076 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ebb725-29ae-4902-9b6b-6258039bb6c0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "72ebb725-29ae-4902-9b6b-6258039bb6c0" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.567692 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "72ebb725-29ae-4902-9b6b-6258039bb6c0" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.569325 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "72ebb725-29ae-4902-9b6b-6258039bb6c0" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.583909 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72ebb725-29ae-4902-9b6b-6258039bb6c0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "72ebb725-29ae-4902-9b6b-6258039bb6c0" (UID: "72ebb725-29ae-4902-9b6b-6258039bb6c0"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.657726 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jllh4\" (UniqueName: \"kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-kube-api-access-jllh4\") on node \"crc\" DevicePath \"\"" Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.657952 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.658085 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72ebb725-29ae-4902-9b6b-6258039bb6c0-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.658201 4717 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72ebb725-29ae-4902-9b6b-6258039bb6c0-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.658320 4717 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72ebb725-29ae-4902-9b6b-6258039bb6c0-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.658429 4717 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72ebb725-29ae-4902-9b6b-6258039bb6c0-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 21 21:53:10 crc kubenswrapper[4717]: I0221 21:53:10.658537 4717 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72ebb725-29ae-4902-9b6b-6258039bb6c0-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 21 21:53:11 crc 
kubenswrapper[4717]: I0221 21:53:11.342074 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" event={"ID":"72ebb725-29ae-4902-9b6b-6258039bb6c0","Type":"ContainerDied","Data":"a803ead3efbcf01fa35d118c075edaf7fcdbf9fcfbe2bb5132c2dca34bd7b410"} Feb 21 21:53:11 crc kubenswrapper[4717]: I0221 21:53:11.342149 4717 scope.go:117] "RemoveContainer" containerID="38d3ac1f0a3cd7335d2fc7b1e79f0b101179bcc8a2bfb416f9928edccbac5621" Feb 21 21:53:11 crc kubenswrapper[4717]: I0221 21:53:11.342181 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wqr6g" Feb 21 21:53:11 crc kubenswrapper[4717]: I0221 21:53:11.395501 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wqr6g"] Feb 21 21:53:11 crc kubenswrapper[4717]: I0221 21:53:11.402525 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wqr6g"] Feb 21 21:53:11 crc kubenswrapper[4717]: I0221 21:53:11.989231 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72ebb725-29ae-4902-9b6b-6258039bb6c0" path="/var/lib/kubelet/pods/72ebb725-29ae-4902-9b6b-6258039bb6c0/volumes" Feb 21 21:55:09 crc kubenswrapper[4717]: I0221 21:55:09.063252 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 21:55:09 crc kubenswrapper[4717]: I0221 21:55:09.063892 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.932710 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-pwd5d"] Feb 21 21:55:27 crc kubenswrapper[4717]: E0221 21:55:27.933310 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ebb725-29ae-4902-9b6b-6258039bb6c0" containerName="registry" Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.933322 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ebb725-29ae-4902-9b6b-6258039bb6c0" containerName="registry" Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.933409 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="72ebb725-29ae-4902-9b6b-6258039bb6c0" containerName="registry" Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.933794 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-pwd5d" Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.935411 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.935610 4717 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-z5lcq" Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.936352 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.940943 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-tmtcj"] Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.941739 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-tmtcj" Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.946608 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-pwd5d"] Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.957283 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-tmtcj"] Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.962627 4717 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-2h5j8" Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.969435 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vkn9j"] Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.970208 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vkn9j" Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.972296 4717 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-69j27" Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.975670 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8968v\" (UniqueName: \"kubernetes.io/projected/36b94939-dd01-40b9-a23b-6408a2cd36e8-kube-api-access-8968v\") pod \"cert-manager-858654f9db-tmtcj\" (UID: \"36b94939-dd01-40b9-a23b-6408a2cd36e8\") " pod="cert-manager/cert-manager-858654f9db-tmtcj" Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.975744 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv84n\" (UniqueName: \"kubernetes.io/projected/11020ca7-c11f-4b18-a5f1-e1ab7bc148d2-kube-api-access-mv84n\") pod \"cert-manager-cainjector-cf98fcc89-pwd5d\" (UID: \"11020ca7-c11f-4b18-a5f1-e1ab7bc148d2\") " 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-pwd5d" Feb 21 21:55:27 crc kubenswrapper[4717]: I0221 21:55:27.988104 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vkn9j"] Feb 21 21:55:28 crc kubenswrapper[4717]: I0221 21:55:28.076441 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfdmr\" (UniqueName: \"kubernetes.io/projected/86b0523d-fba9-48d5-aaa6-33682ae7336a-kube-api-access-rfdmr\") pod \"cert-manager-webhook-687f57d79b-vkn9j\" (UID: \"86b0523d-fba9-48d5-aaa6-33682ae7336a\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vkn9j" Feb 21 21:55:28 crc kubenswrapper[4717]: I0221 21:55:28.076709 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv84n\" (UniqueName: \"kubernetes.io/projected/11020ca7-c11f-4b18-a5f1-e1ab7bc148d2-kube-api-access-mv84n\") pod \"cert-manager-cainjector-cf98fcc89-pwd5d\" (UID: \"11020ca7-c11f-4b18-a5f1-e1ab7bc148d2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-pwd5d" Feb 21 21:55:28 crc kubenswrapper[4717]: I0221 21:55:28.076772 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8968v\" (UniqueName: \"kubernetes.io/projected/36b94939-dd01-40b9-a23b-6408a2cd36e8-kube-api-access-8968v\") pod \"cert-manager-858654f9db-tmtcj\" (UID: \"36b94939-dd01-40b9-a23b-6408a2cd36e8\") " pod="cert-manager/cert-manager-858654f9db-tmtcj" Feb 21 21:55:28 crc kubenswrapper[4717]: I0221 21:55:28.101780 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8968v\" (UniqueName: \"kubernetes.io/projected/36b94939-dd01-40b9-a23b-6408a2cd36e8-kube-api-access-8968v\") pod \"cert-manager-858654f9db-tmtcj\" (UID: \"36b94939-dd01-40b9-a23b-6408a2cd36e8\") " pod="cert-manager/cert-manager-858654f9db-tmtcj" Feb 21 21:55:28 crc kubenswrapper[4717]: I0221 21:55:28.102623 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv84n\" (UniqueName: \"kubernetes.io/projected/11020ca7-c11f-4b18-a5f1-e1ab7bc148d2-kube-api-access-mv84n\") pod \"cert-manager-cainjector-cf98fcc89-pwd5d\" (UID: \"11020ca7-c11f-4b18-a5f1-e1ab7bc148d2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-pwd5d" Feb 21 21:55:28 crc kubenswrapper[4717]: I0221 21:55:28.178397 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfdmr\" (UniqueName: \"kubernetes.io/projected/86b0523d-fba9-48d5-aaa6-33682ae7336a-kube-api-access-rfdmr\") pod \"cert-manager-webhook-687f57d79b-vkn9j\" (UID: \"86b0523d-fba9-48d5-aaa6-33682ae7336a\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vkn9j" Feb 21 21:55:28 crc kubenswrapper[4717]: I0221 21:55:28.198782 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfdmr\" (UniqueName: \"kubernetes.io/projected/86b0523d-fba9-48d5-aaa6-33682ae7336a-kube-api-access-rfdmr\") pod \"cert-manager-webhook-687f57d79b-vkn9j\" (UID: \"86b0523d-fba9-48d5-aaa6-33682ae7336a\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vkn9j" Feb 21 21:55:28 crc kubenswrapper[4717]: I0221 21:55:28.265875 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-pwd5d" Feb 21 21:55:28 crc kubenswrapper[4717]: I0221 21:55:28.278844 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-tmtcj" Feb 21 21:55:28 crc kubenswrapper[4717]: I0221 21:55:28.288575 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vkn9j" Feb 21 21:55:28 crc kubenswrapper[4717]: I0221 21:55:28.565618 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vkn9j"] Feb 21 21:55:28 crc kubenswrapper[4717]: I0221 21:55:28.574223 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 21 21:55:28 crc kubenswrapper[4717]: I0221 21:55:28.696139 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-tmtcj"] Feb 21 21:55:28 crc kubenswrapper[4717]: W0221 21:55:28.702261 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36b94939_dd01_40b9_a23b_6408a2cd36e8.slice/crio-0247b50694edd905190ab5529bd3e3c8e4050ca92289639c3f824f5415d40439 WatchSource:0}: Error finding container 0247b50694edd905190ab5529bd3e3c8e4050ca92289639c3f824f5415d40439: Status 404 returned error can't find the container with id 0247b50694edd905190ab5529bd3e3c8e4050ca92289639c3f824f5415d40439 Feb 21 21:55:28 crc kubenswrapper[4717]: I0221 21:55:28.705561 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-pwd5d"] Feb 21 21:55:28 crc kubenswrapper[4717]: W0221 21:55:28.711513 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11020ca7_c11f_4b18_a5f1_e1ab7bc148d2.slice/crio-509d3dc814508cfce02742c80a8d9d65c1ff3172b6ef660cf8d7044e3aa3daa6 WatchSource:0}: Error finding container 509d3dc814508cfce02742c80a8d9d65c1ff3172b6ef660cf8d7044e3aa3daa6: Status 404 returned error can't find the container with id 509d3dc814508cfce02742c80a8d9d65c1ff3172b6ef660cf8d7044e3aa3daa6 Feb 21 21:55:29 crc kubenswrapper[4717]: I0221 21:55:29.307512 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-pwd5d" event={"ID":"11020ca7-c11f-4b18-a5f1-e1ab7bc148d2","Type":"ContainerStarted","Data":"509d3dc814508cfce02742c80a8d9d65c1ff3172b6ef660cf8d7044e3aa3daa6"} Feb 21 21:55:29 crc kubenswrapper[4717]: I0221 21:55:29.310043 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vkn9j" event={"ID":"86b0523d-fba9-48d5-aaa6-33682ae7336a","Type":"ContainerStarted","Data":"57f54ccb990365ac816dae7a4bdf2b14cc2c13b62e0baae23bcbe902fab050be"} Feb 21 21:55:29 crc kubenswrapper[4717]: I0221 21:55:29.313243 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-tmtcj" event={"ID":"36b94939-dd01-40b9-a23b-6408a2cd36e8","Type":"ContainerStarted","Data":"0247b50694edd905190ab5529bd3e3c8e4050ca92289639c3f824f5415d40439"} Feb 21 21:55:33 crc kubenswrapper[4717]: I0221 21:55:33.347253 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vkn9j" event={"ID":"86b0523d-fba9-48d5-aaa6-33682ae7336a","Type":"ContainerStarted","Data":"bce168a6810270f5200a04e5e1954eedf8e905eb0e0b0e63fbad9a019ebe103f"} Feb 21 21:55:33 crc kubenswrapper[4717]: I0221 21:55:33.347572 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-vkn9j" Feb 21 21:55:33 crc kubenswrapper[4717]: I0221 21:55:33.349487 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-tmtcj" event={"ID":"36b94939-dd01-40b9-a23b-6408a2cd36e8","Type":"ContainerStarted","Data":"266fe6a392f1978eefceeab227d3595a1be63cdde8402b97bc8ead676a9b6624"} Feb 21 21:55:33 crc kubenswrapper[4717]: I0221 21:55:33.351216 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-pwd5d" 
event={"ID":"11020ca7-c11f-4b18-a5f1-e1ab7bc148d2","Type":"ContainerStarted","Data":"9c7271b847d24f42310e4590fcdea103026faf9b0bd6fbfa9b6e68fbd8f7268d"} Feb 21 21:55:33 crc kubenswrapper[4717]: I0221 21:55:33.374317 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-vkn9j" podStartSLOduration=2.83853964 podStartE2EDuration="6.374293346s" podCreationTimestamp="2026-02-21 21:55:27 +0000 UTC" firstStartedPulling="2026-02-21 21:55:28.574009203 +0000 UTC m=+543.355542825" lastFinishedPulling="2026-02-21 21:55:32.109762899 +0000 UTC m=+546.891296531" observedRunningTime="2026-02-21 21:55:33.368934499 +0000 UTC m=+548.150468151" watchObservedRunningTime="2026-02-21 21:55:33.374293346 +0000 UTC m=+548.155826998" Feb 21 21:55:33 crc kubenswrapper[4717]: I0221 21:55:33.390158 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-pwd5d" podStartSLOduration=3.002494758 podStartE2EDuration="6.390134937s" podCreationTimestamp="2026-02-21 21:55:27 +0000 UTC" firstStartedPulling="2026-02-21 21:55:28.713125135 +0000 UTC m=+543.494658757" lastFinishedPulling="2026-02-21 21:55:32.100765304 +0000 UTC m=+546.882298936" observedRunningTime="2026-02-21 21:55:33.386140761 +0000 UTC m=+548.167674403" watchObservedRunningTime="2026-02-21 21:55:33.390134937 +0000 UTC m=+548.171668589" Feb 21 21:55:33 crc kubenswrapper[4717]: I0221 21:55:33.452705 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-tmtcj" podStartSLOduration=2.970801799 podStartE2EDuration="6.452686704s" podCreationTimestamp="2026-02-21 21:55:27 +0000 UTC" firstStartedPulling="2026-02-21 21:55:28.705124354 +0000 UTC m=+543.486657986" lastFinishedPulling="2026-02-21 21:55:32.187009249 +0000 UTC m=+546.968542891" observedRunningTime="2026-02-21 21:55:33.452607412 +0000 UTC m=+548.234141074" watchObservedRunningTime="2026-02-21 
21:55:33.452686704 +0000 UTC m=+548.234220346" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.140561 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7ndm2"] Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.141903 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovn-controller" containerID="cri-o://c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723" gracePeriod=30 Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.141927 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="sbdb" containerID="cri-o://a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887" gracePeriod=30 Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.142103 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="kube-rbac-proxy-node" containerID="cri-o://257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac" gracePeriod=30 Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.142071 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156" gracePeriod=30 Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.142172 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovn-acl-logging" 
containerID="cri-o://ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563" gracePeriod=30 Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.142130 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="nbdb" containerID="cri-o://8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09" gracePeriod=30 Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.142240 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="northd" containerID="cri-o://b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf" gracePeriod=30 Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.237572 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" containerID="cri-o://b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912" gracePeriod=30 Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.296416 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-vkn9j" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.389104 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovnkube-controller/3.log" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.391043 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovn-acl-logging/0.log" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.391437 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovn-controller/0.log" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.391707 4717 generic.go:334] "Generic (PLEG): container finished" podID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerID="b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912" exitCode=0 Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.391735 4717 generic.go:334] "Generic (PLEG): container finished" podID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerID="36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156" exitCode=0 Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.391747 4717 generic.go:334] "Generic (PLEG): container finished" podID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerID="257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac" exitCode=0 Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.391757 4717 generic.go:334] "Generic (PLEG): container finished" podID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerID="ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563" exitCode=143 Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.391766 4717 generic.go:334] "Generic (PLEG): container finished" podID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerID="c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723" exitCode=143 Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.391769 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerDied","Data":"b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912"} Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.391805 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" 
event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerDied","Data":"36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156"} Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.391817 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerDied","Data":"257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac"} Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.391826 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerDied","Data":"ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563"} Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.391829 4717 scope.go:117] "RemoveContainer" containerID="d72e45218efa3b08caae693c94b5b14ec13d7bb3f53bb3e75d85a9618625f3fa" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.391834 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerDied","Data":"c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723"} Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.393235 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzd94_d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da/kube-multus/2.log" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.395135 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzd94_d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da/kube-multus/1.log" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.395178 4717 generic.go:334] "Generic (PLEG): container finished" podID="d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da" containerID="879e836c48c185e9832695059e5cff570b9f6ad5d09395cf0f9f6e7c2a7682b4" exitCode=2 Feb 21 21:55:38 crc 
kubenswrapper[4717]: I0221 21:55:38.395215 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bzd94" event={"ID":"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da","Type":"ContainerDied","Data":"879e836c48c185e9832695059e5cff570b9f6ad5d09395cf0f9f6e7c2a7682b4"} Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.395711 4717 scope.go:117] "RemoveContainer" containerID="879e836c48c185e9832695059e5cff570b9f6ad5d09395cf0f9f6e7c2a7682b4" Feb 21 21:55:38 crc kubenswrapper[4717]: E0221 21:55:38.395918 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bzd94_openshift-multus(d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da)\"" pod="openshift-multus/multus-bzd94" podUID="d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.527986 4717 scope.go:117] "RemoveContainer" containerID="3ed2cfeb7efa6bfd33e2935b831034faad1ff107bd112b21d7e85dc510ddc227" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.529292 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovn-acl-logging/0.log" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.530259 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovn-controller/0.log" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.532215 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.605519 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9hfcl"] Feb 21 21:55:38 crc kubenswrapper[4717]: E0221 21:55:38.605857 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="nbdb" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.605914 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="nbdb" Feb 21 21:55:38 crc kubenswrapper[4717]: E0221 21:55:38.605938 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.605952 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: E0221 21:55:38.605966 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="kubecfg-setup" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.605979 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="kubecfg-setup" Feb 21 21:55:38 crc kubenswrapper[4717]: E0221 21:55:38.605997 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovn-acl-logging" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606009 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovn-acl-logging" Feb 21 21:55:38 crc kubenswrapper[4717]: E0221 21:55:38.606030 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" 
containerName="ovn-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606043 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovn-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: E0221 21:55:38.606062 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606073 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: E0221 21:55:38.606086 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="kube-rbac-proxy-node" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606098 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="kube-rbac-proxy-node" Feb 21 21:55:38 crc kubenswrapper[4717]: E0221 21:55:38.606118 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="kube-rbac-proxy-ovn-metrics" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606130 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="kube-rbac-proxy-ovn-metrics" Feb 21 21:55:38 crc kubenswrapper[4717]: E0221 21:55:38.606166 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="northd" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606178 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="northd" Feb 21 21:55:38 crc kubenswrapper[4717]: E0221 21:55:38.606195 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606209 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: E0221 21:55:38.606224 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="sbdb" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606236 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="sbdb" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606406 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="sbdb" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606422 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovn-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606436 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovn-acl-logging" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606451 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="kube-rbac-proxy-node" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606466 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="northd" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606480 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606494 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606507 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="kube-rbac-proxy-ovn-metrics" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606524 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606541 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="nbdb" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606555 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: E0221 21:55:38.606731 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606747 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.606945 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: E0221 21:55:38.607136 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.607157 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerName="ovnkube-controller" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.609759 4717 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.625339 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/755af0b9-c1d3-4c8f-a654-648af3256aa4-ovn-node-metrics-cert\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.625397 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc9jm\" (UniqueName: \"kubernetes.io/projected/755af0b9-c1d3-4c8f-a654-648af3256aa4-kube-api-access-hc9jm\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.625448 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-run-openvswitch\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.625485 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/755af0b9-c1d3-4c8f-a654-648af3256aa4-ovnkube-script-lib\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.625518 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/755af0b9-c1d3-4c8f-a654-648af3256aa4-ovnkube-config\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.625562 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-run-netns\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.625617 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-var-lib-openvswitch\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.625659 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-run-ovn-kubernetes\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.625695 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-node-log\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.625726 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-slash\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.625773 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-systemd-units\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.625824 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.625914 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-run-systemd\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.625955 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-cni-bin\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.625983 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/755af0b9-c1d3-4c8f-a654-648af3256aa4-env-overrides\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.626013 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-log-socket\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.626113 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-run-ovn\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.626151 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-kubelet\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.626235 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-cni-netd\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.626291 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-etc-openvswitch\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.727703 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-var-lib-openvswitch\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.727753 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-openvswitch\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.727778 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78fqm\" (UniqueName: \"kubernetes.io/projected/f6a10be9-c25d-42c3-9a4f-e2397cc64852-kube-api-access-78fqm\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.727804 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-slash\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.727841 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-cni-netd\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.727936 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-kubelet\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.727925 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.727954 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-systemd\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728041 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovnkube-config\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728044 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-slash" (OuterVolumeSpecName: "host-slash") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: 
"f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728087 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovnkube-script-lib\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728127 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-ovn\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728066 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728143 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728216 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728197 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-etc-openvswitch\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728406 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-log-socket\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728250 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728456 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-env-overrides\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728244 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728622 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-run-ovn-kubernetes\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728685 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-systemd-units\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728748 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovn-node-metrics-cert\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 
21:55:38.728757 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728811 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-run-netns\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728809 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728813 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728943 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-node-log\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728850 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728995 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-cni-bin\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728999 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-node-log" (OuterVolumeSpecName: "node-log") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729072 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\" (UID: \"f6a10be9-c25d-42c3-9a4f-e2397cc64852\") " Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.728937 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729046 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729226 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729245 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-log-socket" (OuterVolumeSpecName: "log-socket") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729232 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729306 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-systemd-units\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729357 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-systemd-units\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729387 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729445 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729474 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-run-systemd\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729534 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-cni-bin\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729563 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/755af0b9-c1d3-4c8f-a654-648af3256aa4-env-overrides\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729599 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-log-socket\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729606 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-cni-bin\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729604 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-run-systemd\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729665 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-log-socket\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729728 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-run-ovn\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729768 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-kubelet\") pod \"ovnkube-node-9hfcl\" (UID: 
\"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729801 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-cni-netd\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729833 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-etc-openvswitch\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729909 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/755af0b9-c1d3-4c8f-a654-648af3256aa4-ovn-node-metrics-cert\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.729953 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc9jm\" (UniqueName: \"kubernetes.io/projected/755af0b9-c1d3-4c8f-a654-648af3256aa4-kube-api-access-hc9jm\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730018 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-run-openvswitch\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730061 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/755af0b9-c1d3-4c8f-a654-648af3256aa4-ovnkube-script-lib\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730098 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/755af0b9-c1d3-4c8f-a654-648af3256aa4-ovnkube-config\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730148 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-run-netns\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730226 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-var-lib-openvswitch\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730257 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-run-ovn-kubernetes\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730293 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-node-log\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730324 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-slash\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730421 4717 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730441 4717 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730462 4717 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730479 4717 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-node-log\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730497 4717 reconciler_common.go:293] "Volume detached for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730515 4717 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730534 4717 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730551 4717 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730568 4717 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-slash\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730587 4717 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730590 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/755af0b9-c1d3-4c8f-a654-648af3256aa4-env-overrides\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730636 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-slash\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730603 4717 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730678 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730691 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-run-openvswitch\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.731707 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/755af0b9-c1d3-4c8f-a654-648af3256aa4-ovnkube-script-lib\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.731730 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/755af0b9-c1d3-4c8f-a654-648af3256aa4-ovnkube-config\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc 
kubenswrapper[4717]: I0221 21:55:38.731772 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-run-netns\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.731825 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-run-ovn\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.731829 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-var-lib-openvswitch\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.731893 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-run-ovn-kubernetes\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.731913 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-etc-openvswitch\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.730699 4717 reconciler_common.go:293] "Volume detached for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.731946 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-node-log\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.731962 4717 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.731986 4717 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.732002 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-cni-netd\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.732005 4717 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-log-socket\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.732070 4717 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6a10be9-c25d-42c3-9a4f-e2397cc64852-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 
21:55:38.731957 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/755af0b9-c1d3-4c8f-a654-648af3256aa4-host-kubelet\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.736079 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6a10be9-c25d-42c3-9a4f-e2397cc64852-kube-api-access-78fqm" (OuterVolumeSpecName: "kube-api-access-78fqm") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "kube-api-access-78fqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.736108 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.738567 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/755af0b9-c1d3-4c8f-a654-648af3256aa4-ovn-node-metrics-cert\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.743189 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f6a10be9-c25d-42c3-9a4f-e2397cc64852" (UID: "f6a10be9-c25d-42c3-9a4f-e2397cc64852"). 
InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.760949 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc9jm\" (UniqueName: \"kubernetes.io/projected/755af0b9-c1d3-4c8f-a654-648af3256aa4-kube-api-access-hc9jm\") pod \"ovnkube-node-9hfcl\" (UID: \"755af0b9-c1d3-4c8f-a654-648af3256aa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.834480 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6a10be9-c25d-42c3-9a4f-e2397cc64852-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.834563 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78fqm\" (UniqueName: \"kubernetes.io/projected/f6a10be9-c25d-42c3-9a4f-e2397cc64852-kube-api-access-78fqm\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.834589 4717 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6a10be9-c25d-42c3-9a4f-e2397cc64852-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 21 21:55:38 crc kubenswrapper[4717]: I0221 21:55:38.928226 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.063090 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.063222 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.405280 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzd94_d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da/kube-multus/2.log" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.410772 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovn-acl-logging/0.log" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.411580 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7ndm2_f6a10be9-c25d-42c3-9a4f-e2397cc64852/ovn-controller/0.log" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.412283 4717 generic.go:334] "Generic (PLEG): container finished" podID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerID="a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887" exitCode=0 Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.412324 4717 generic.go:334] "Generic (PLEG): container finished" podID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerID="8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09" 
exitCode=0 Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.412343 4717 generic.go:334] "Generic (PLEG): container finished" podID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" containerID="b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf" exitCode=0 Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.412343 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerDied","Data":"a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887"} Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.412386 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.412402 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerDied","Data":"8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09"} Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.412427 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerDied","Data":"b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf"} Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.412449 4717 scope.go:117] "RemoveContainer" containerID="b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.412449 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7ndm2" event={"ID":"f6a10be9-c25d-42c3-9a4f-e2397cc64852","Type":"ContainerDied","Data":"25c7fbb54410bde86b43a7110b201a47b7e85333bda8777ca328f34602963bd6"} Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.415283 4717 generic.go:334] 
"Generic (PLEG): container finished" podID="755af0b9-c1d3-4c8f-a654-648af3256aa4" containerID="068524b69d7c18e7f03cc3e6fec4bd1205535d602475dab1330f42a2e2e54ab3" exitCode=0 Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.415333 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" event={"ID":"755af0b9-c1d3-4c8f-a654-648af3256aa4","Type":"ContainerDied","Data":"068524b69d7c18e7f03cc3e6fec4bd1205535d602475dab1330f42a2e2e54ab3"} Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.415366 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" event={"ID":"755af0b9-c1d3-4c8f-a654-648af3256aa4","Type":"ContainerStarted","Data":"0d64ce4bfea88430aa6ca6349af60cb552aab6e3291ea6136ce6e06532a7c89c"} Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.471463 4717 scope.go:117] "RemoveContainer" containerID="a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.527592 4717 scope.go:117] "RemoveContainer" containerID="8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.613298 4717 scope.go:117] "RemoveContainer" containerID="b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.614010 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7ndm2"] Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.617722 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7ndm2"] Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.627044 4717 scope.go:117] "RemoveContainer" containerID="36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.649272 4717 scope.go:117] "RemoveContainer" 
containerID="257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.664714 4717 scope.go:117] "RemoveContainer" containerID="ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.693390 4717 scope.go:117] "RemoveContainer" containerID="c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.714437 4717 scope.go:117] "RemoveContainer" containerID="6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.737980 4717 scope.go:117] "RemoveContainer" containerID="b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912" Feb 21 21:55:39 crc kubenswrapper[4717]: E0221 21:55:39.738362 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912\": container with ID starting with b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912 not found: ID does not exist" containerID="b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.738403 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912"} err="failed to get container status \"b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912\": rpc error: code = NotFound desc = could not find container \"b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912\": container with ID starting with b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.738430 4717 scope.go:117] "RemoveContainer" 
containerID="a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887" Feb 21 21:55:39 crc kubenswrapper[4717]: E0221 21:55:39.738941 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\": container with ID starting with a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887 not found: ID does not exist" containerID="a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.738971 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887"} err="failed to get container status \"a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\": rpc error: code = NotFound desc = could not find container \"a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\": container with ID starting with a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.738992 4717 scope.go:117] "RemoveContainer" containerID="8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09" Feb 21 21:55:39 crc kubenswrapper[4717]: E0221 21:55:39.739400 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\": container with ID starting with 8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09 not found: ID does not exist" containerID="8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.739504 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09"} err="failed to get container status \"8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\": rpc error: code = NotFound desc = could not find container \"8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\": container with ID starting with 8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.739582 4717 scope.go:117] "RemoveContainer" containerID="b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf" Feb 21 21:55:39 crc kubenswrapper[4717]: E0221 21:55:39.739918 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\": container with ID starting with b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf not found: ID does not exist" containerID="b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.739957 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf"} err="failed to get container status \"b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\": rpc error: code = NotFound desc = could not find container \"b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\": container with ID starting with b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.739981 4717 scope.go:117] "RemoveContainer" containerID="36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156" Feb 21 21:55:39 crc kubenswrapper[4717]: E0221 21:55:39.740363 4717 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\": container with ID starting with 36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156 not found: ID does not exist" containerID="36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.740417 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156"} err="failed to get container status \"36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\": rpc error: code = NotFound desc = could not find container \"36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\": container with ID starting with 36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.740453 4717 scope.go:117] "RemoveContainer" containerID="257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac" Feb 21 21:55:39 crc kubenswrapper[4717]: E0221 21:55:39.740795 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\": container with ID starting with 257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac not found: ID does not exist" containerID="257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.740835 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac"} err="failed to get container status \"257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\": rpc error: code = NotFound desc = could not find container 
\"257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\": container with ID starting with 257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.740882 4717 scope.go:117] "RemoveContainer" containerID="ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563" Feb 21 21:55:39 crc kubenswrapper[4717]: E0221 21:55:39.741154 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\": container with ID starting with ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563 not found: ID does not exist" containerID="ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.741193 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563"} err="failed to get container status \"ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\": rpc error: code = NotFound desc = could not find container \"ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\": container with ID starting with ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.741213 4717 scope.go:117] "RemoveContainer" containerID="c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723" Feb 21 21:55:39 crc kubenswrapper[4717]: E0221 21:55:39.741534 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\": container with ID starting with c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723 not found: ID does not exist" 
containerID="c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.741558 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723"} err="failed to get container status \"c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\": rpc error: code = NotFound desc = could not find container \"c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\": container with ID starting with c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.741573 4717 scope.go:117] "RemoveContainer" containerID="6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c" Feb 21 21:55:39 crc kubenswrapper[4717]: E0221 21:55:39.741848 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\": container with ID starting with 6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c not found: ID does not exist" containerID="6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.741885 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c"} err="failed to get container status \"6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\": rpc error: code = NotFound desc = could not find container \"6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\": container with ID starting with 6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.741901 4717 scope.go:117] 
"RemoveContainer" containerID="b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.742150 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912"} err="failed to get container status \"b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912\": rpc error: code = NotFound desc = could not find container \"b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912\": container with ID starting with b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.742182 4717 scope.go:117] "RemoveContainer" containerID="a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.742435 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887"} err="failed to get container status \"a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\": rpc error: code = NotFound desc = could not find container \"a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\": container with ID starting with a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.742454 4717 scope.go:117] "RemoveContainer" containerID="8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.742790 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09"} err="failed to get container status \"8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\": rpc error: code = 
NotFound desc = could not find container \"8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\": container with ID starting with 8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.742823 4717 scope.go:117] "RemoveContainer" containerID="b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.743176 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf"} err="failed to get container status \"b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\": rpc error: code = NotFound desc = could not find container \"b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\": container with ID starting with b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.743196 4717 scope.go:117] "RemoveContainer" containerID="36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.743526 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156"} err="failed to get container status \"36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\": rpc error: code = NotFound desc = could not find container \"36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\": container with ID starting with 36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.743557 4717 scope.go:117] "RemoveContainer" containerID="257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac" Feb 21 21:55:39 crc 
kubenswrapper[4717]: I0221 21:55:39.743945 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac"} err="failed to get container status \"257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\": rpc error: code = NotFound desc = could not find container \"257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\": container with ID starting with 257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.743987 4717 scope.go:117] "RemoveContainer" containerID="ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.744293 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563"} err="failed to get container status \"ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\": rpc error: code = NotFound desc = could not find container \"ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\": container with ID starting with ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.744313 4717 scope.go:117] "RemoveContainer" containerID="c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.744668 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723"} err="failed to get container status \"c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\": rpc error: code = NotFound desc = could not find container \"c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\": container 
with ID starting with c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.744694 4717 scope.go:117] "RemoveContainer" containerID="6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.745011 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c"} err="failed to get container status \"6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\": rpc error: code = NotFound desc = could not find container \"6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\": container with ID starting with 6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.745030 4717 scope.go:117] "RemoveContainer" containerID="b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.745384 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912"} err="failed to get container status \"b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912\": rpc error: code = NotFound desc = could not find container \"b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912\": container with ID starting with b34e0d7bd56dcb9dd9d73118f5b440cb6dc947f0a1787000d87feefe12968912 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.745408 4717 scope.go:117] "RemoveContainer" containerID="a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.745743 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887"} err="failed to get container status \"a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\": rpc error: code = NotFound desc = could not find container \"a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887\": container with ID starting with a53ffb1cd6401018fb52ec754bef737fb5d45fa871e0ce2fef71e38aad25a887 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.745769 4717 scope.go:117] "RemoveContainer" containerID="8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.746054 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09"} err="failed to get container status \"8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\": rpc error: code = NotFound desc = could not find container \"8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09\": container with ID starting with 8bec815fd64b04e4b0d4688862acce0932f427b63b0028908e4fe61b14226b09 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.746097 4717 scope.go:117] "RemoveContainer" containerID="b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.746613 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf"} err="failed to get container status \"b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\": rpc error: code = NotFound desc = could not find container \"b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf\": container with ID starting with b85d07ed414d577bd09a5a1afe19ec7c578cf1f598d3cb3b69430efe8d2920cf not found: ID does not 
exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.746657 4717 scope.go:117] "RemoveContainer" containerID="36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.747313 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156"} err="failed to get container status \"36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\": rpc error: code = NotFound desc = could not find container \"36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156\": container with ID starting with 36b8829ea34f449218a46e4bc163044a28e734989ba73c34bf8eec7c494a8156 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.747343 4717 scope.go:117] "RemoveContainer" containerID="257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.747807 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac"} err="failed to get container status \"257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\": rpc error: code = NotFound desc = could not find container \"257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac\": container with ID starting with 257f5b325375443eff68bb750c41a34f2162644654441a64ccaa9b4811e41dac not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.747834 4717 scope.go:117] "RemoveContainer" containerID="ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.748264 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563"} err="failed to get container status 
\"ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\": rpc error: code = NotFound desc = could not find container \"ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563\": container with ID starting with ac4245aa93e118305640d343bca48a905b0d1070fca269203c665a6d29215563 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.748292 4717 scope.go:117] "RemoveContainer" containerID="c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.748647 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723"} err="failed to get container status \"c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\": rpc error: code = NotFound desc = could not find container \"c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723\": container with ID starting with c7535ebb223151da7962ee3e212290f52b06a4af9b9d9bf4c048d3a7a2036723 not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.748677 4717 scope.go:117] "RemoveContainer" containerID="6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.748992 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c"} err="failed to get container status \"6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\": rpc error: code = NotFound desc = could not find container \"6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c\": container with ID starting with 6fcc6d7ad30b8a6f174cfaa267e8b9e4caa586a06e7bb8834b2091a7b10f579c not found: ID does not exist" Feb 21 21:55:39 crc kubenswrapper[4717]: I0221 21:55:39.987306 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f6a10be9-c25d-42c3-9a4f-e2397cc64852" path="/var/lib/kubelet/pods/f6a10be9-c25d-42c3-9a4f-e2397cc64852/volumes" Feb 21 21:55:40 crc kubenswrapper[4717]: I0221 21:55:40.433269 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" event={"ID":"755af0b9-c1d3-4c8f-a654-648af3256aa4","Type":"ContainerStarted","Data":"a9fa11c0a5b1e8a58d3589bba3a47996d7706a4ba6faf0f26cfd3642bb1e2f8e"} Feb 21 21:55:40 crc kubenswrapper[4717]: I0221 21:55:40.433916 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" event={"ID":"755af0b9-c1d3-4c8f-a654-648af3256aa4","Type":"ContainerStarted","Data":"a70f3399b4794d6c63f3414257a9b3ca9641da3ecb249a51581e024332c9b245"} Feb 21 21:55:40 crc kubenswrapper[4717]: I0221 21:55:40.433946 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" event={"ID":"755af0b9-c1d3-4c8f-a654-648af3256aa4","Type":"ContainerStarted","Data":"61d87972593a4f654cdf2485694a62683ee3907ac94e682b21ed573944706ed5"} Feb 21 21:55:40 crc kubenswrapper[4717]: I0221 21:55:40.433963 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" event={"ID":"755af0b9-c1d3-4c8f-a654-648af3256aa4","Type":"ContainerStarted","Data":"8185895cf03cdebe645bbb51e9d5c9a25f977d6c75b776ade4cd96a7ca44d0f8"} Feb 21 21:55:40 crc kubenswrapper[4717]: I0221 21:55:40.433980 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" event={"ID":"755af0b9-c1d3-4c8f-a654-648af3256aa4","Type":"ContainerStarted","Data":"ed66ca004061a09f06b37b0c710099f25fe9224efd780b20532b62a26a4bbc79"} Feb 21 21:55:40 crc kubenswrapper[4717]: I0221 21:55:40.433997 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" 
event={"ID":"755af0b9-c1d3-4c8f-a654-648af3256aa4","Type":"ContainerStarted","Data":"0e5b984d64a86018d41e05e388e698e6a0b482b680bc1173f42f3bf8fa1792e7"} Feb 21 21:55:43 crc kubenswrapper[4717]: I0221 21:55:43.464019 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" event={"ID":"755af0b9-c1d3-4c8f-a654-648af3256aa4","Type":"ContainerStarted","Data":"eeb8bf7b572fcd338e6c9f4d25dc987c6aea7de3e5fdc931786589447e37e67b"} Feb 21 21:55:45 crc kubenswrapper[4717]: I0221 21:55:45.484939 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" event={"ID":"755af0b9-c1d3-4c8f-a654-648af3256aa4","Type":"ContainerStarted","Data":"e89462bddb711092c08796a2d142d90e45a232347ca7105dcb7bad38bc69c1a1"} Feb 21 21:55:45 crc kubenswrapper[4717]: I0221 21:55:45.485799 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:45 crc kubenswrapper[4717]: I0221 21:55:45.521250 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:45 crc kubenswrapper[4717]: I0221 21:55:45.538284 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" podStartSLOduration=7.538255232 podStartE2EDuration="7.538255232s" podCreationTimestamp="2026-02-21 21:55:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:55:45.533610641 +0000 UTC m=+560.315144303" watchObservedRunningTime="2026-02-21 21:55:45.538255232 +0000 UTC m=+560.319788884" Feb 21 21:55:46 crc kubenswrapper[4717]: I0221 21:55:46.492596 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:46 crc kubenswrapper[4717]: I0221 21:55:46.492672 4717 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:46 crc kubenswrapper[4717]: I0221 21:55:46.540462 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:55:50 crc kubenswrapper[4717]: I0221 21:55:50.976978 4717 scope.go:117] "RemoveContainer" containerID="879e836c48c185e9832695059e5cff570b9f6ad5d09395cf0f9f6e7c2a7682b4" Feb 21 21:55:50 crc kubenswrapper[4717]: E0221 21:55:50.978011 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bzd94_openshift-multus(d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da)\"" pod="openshift-multus/multus-bzd94" podUID="d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da" Feb 21 21:56:03 crc kubenswrapper[4717]: I0221 21:56:03.977278 4717 scope.go:117] "RemoveContainer" containerID="879e836c48c185e9832695059e5cff570b9f6ad5d09395cf0f9f6e7c2a7682b4" Feb 21 21:56:04 crc kubenswrapper[4717]: I0221 21:56:04.624581 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bzd94_d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da/kube-multus/2.log" Feb 21 21:56:04 crc kubenswrapper[4717]: I0221 21:56:04.625717 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bzd94" event={"ID":"d8a3d061-4ed4-4ea5-83d7-d2a74b8bf5da","Type":"ContainerStarted","Data":"07650ced308bc9322704d3143e0e3936e0f5f131b13fe56ede5d3fe61b22265e"} Feb 21 21:56:08 crc kubenswrapper[4717]: I0221 21:56:08.974081 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9hfcl" Feb 21 21:56:09 crc kubenswrapper[4717]: I0221 21:56:09.065117 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 21:56:09 crc kubenswrapper[4717]: I0221 21:56:09.065205 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 21:56:09 crc kubenswrapper[4717]: I0221 21:56:09.065274 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:56:09 crc kubenswrapper[4717]: I0221 21:56:09.066119 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"590a8ade18d9099df0dbb922a2c22739aef34b874d1adc46c6c79c7dc49ef4a7"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 21:56:09 crc kubenswrapper[4717]: I0221 21:56:09.066230 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://590a8ade18d9099df0dbb922a2c22739aef34b874d1adc46c6c79c7dc49ef4a7" gracePeriod=600 Feb 21 21:56:09 crc kubenswrapper[4717]: I0221 21:56:09.667053 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"590a8ade18d9099df0dbb922a2c22739aef34b874d1adc46c6c79c7dc49ef4a7"} Feb 21 21:56:09 crc kubenswrapper[4717]: I0221 21:56:09.667479 4717 scope.go:117] "RemoveContainer" 
containerID="f1592279668470423b045cbb5b5e5ff0c27879fab6c9b2573e402c21a013af59" Feb 21 21:56:09 crc kubenswrapper[4717]: I0221 21:56:09.666995 4717 generic.go:334] "Generic (PLEG): container finished" podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerID="590a8ade18d9099df0dbb922a2c22739aef34b874d1adc46c6c79c7dc49ef4a7" exitCode=0 Feb 21 21:56:09 crc kubenswrapper[4717]: I0221 21:56:09.667698 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"38f86864c8d1bb2ef635ae7b8573c0d40328b4d39ce0f3640268f93045f23c56"} Feb 21 21:56:19 crc kubenswrapper[4717]: I0221 21:56:19.571455 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp"] Feb 21 21:56:19 crc kubenswrapper[4717]: I0221 21:56:19.573723 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp" Feb 21 21:56:19 crc kubenswrapper[4717]: I0221 21:56:19.576365 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 21 21:56:19 crc kubenswrapper[4717]: I0221 21:56:19.581120 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp"] Feb 21 21:56:19 crc kubenswrapper[4717]: I0221 21:56:19.612247 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k89f\" (UniqueName: \"kubernetes.io/projected/eb912310-4451-4fc6-b05b-675b6bcfff59-kube-api-access-7k89f\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp\" (UID: \"eb912310-4451-4fc6-b05b-675b6bcfff59\") " 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp" Feb 21 21:56:19 crc kubenswrapper[4717]: I0221 21:56:19.612520 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb912310-4451-4fc6-b05b-675b6bcfff59-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp\" (UID: \"eb912310-4451-4fc6-b05b-675b6bcfff59\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp" Feb 21 21:56:19 crc kubenswrapper[4717]: I0221 21:56:19.612565 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb912310-4451-4fc6-b05b-675b6bcfff59-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp\" (UID: \"eb912310-4451-4fc6-b05b-675b6bcfff59\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp" Feb 21 21:56:19 crc kubenswrapper[4717]: I0221 21:56:19.713185 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k89f\" (UniqueName: \"kubernetes.io/projected/eb912310-4451-4fc6-b05b-675b6bcfff59-kube-api-access-7k89f\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp\" (UID: \"eb912310-4451-4fc6-b05b-675b6bcfff59\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp" Feb 21 21:56:19 crc kubenswrapper[4717]: I0221 21:56:19.713258 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb912310-4451-4fc6-b05b-675b6bcfff59-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp\" (UID: \"eb912310-4451-4fc6-b05b-675b6bcfff59\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp" Feb 21 21:56:19 crc 
kubenswrapper[4717]: I0221 21:56:19.713352 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb912310-4451-4fc6-b05b-675b6bcfff59-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp\" (UID: \"eb912310-4451-4fc6-b05b-675b6bcfff59\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp" Feb 21 21:56:19 crc kubenswrapper[4717]: I0221 21:56:19.714128 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb912310-4451-4fc6-b05b-675b6bcfff59-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp\" (UID: \"eb912310-4451-4fc6-b05b-675b6bcfff59\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp" Feb 21 21:56:19 crc kubenswrapper[4717]: I0221 21:56:19.714374 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb912310-4451-4fc6-b05b-675b6bcfff59-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp\" (UID: \"eb912310-4451-4fc6-b05b-675b6bcfff59\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp" Feb 21 21:56:19 crc kubenswrapper[4717]: I0221 21:56:19.750253 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k89f\" (UniqueName: \"kubernetes.io/projected/eb912310-4451-4fc6-b05b-675b6bcfff59-kube-api-access-7k89f\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp\" (UID: \"eb912310-4451-4fc6-b05b-675b6bcfff59\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp" Feb 21 21:56:19 crc kubenswrapper[4717]: I0221 21:56:19.907979 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp"
Feb 21 21:56:20 crc kubenswrapper[4717]: I0221 21:56:20.243805 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp"]
Feb 21 21:56:20 crc kubenswrapper[4717]: I0221 21:56:20.768697 4717 generic.go:334] "Generic (PLEG): container finished" podID="eb912310-4451-4fc6-b05b-675b6bcfff59" containerID="9ec333b1956acec58b93c94371a09d75a119fc1804c06149b29ec9d2a1aa63a3" exitCode=0
Feb 21 21:56:20 crc kubenswrapper[4717]: I0221 21:56:20.768765 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp" event={"ID":"eb912310-4451-4fc6-b05b-675b6bcfff59","Type":"ContainerDied","Data":"9ec333b1956acec58b93c94371a09d75a119fc1804c06149b29ec9d2a1aa63a3"}
Feb 21 21:56:20 crc kubenswrapper[4717]: I0221 21:56:20.768907 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp" event={"ID":"eb912310-4451-4fc6-b05b-675b6bcfff59","Type":"ContainerStarted","Data":"dc77d4e28599fc1e3359fd1e4f2031b3d7b28ce0ae09d520416d8e1a145fb759"}
Feb 21 21:56:22 crc kubenswrapper[4717]: I0221 21:56:22.787040 4717 generic.go:334] "Generic (PLEG): container finished" podID="eb912310-4451-4fc6-b05b-675b6bcfff59" containerID="c821092f02eb80abf776f949a672cda0b62bda1c867cfedb535ae8d0154f9193" exitCode=0
Feb 21 21:56:22 crc kubenswrapper[4717]: I0221 21:56:22.787184 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp" event={"ID":"eb912310-4451-4fc6-b05b-675b6bcfff59","Type":"ContainerDied","Data":"c821092f02eb80abf776f949a672cda0b62bda1c867cfedb535ae8d0154f9193"}
Feb 21 21:56:23 crc kubenswrapper[4717]: I0221 21:56:23.799404 4717 generic.go:334] "Generic (PLEG): container finished" podID="eb912310-4451-4fc6-b05b-675b6bcfff59" containerID="aa54116fd1b7bcea6d4c2d10812e64af59a423d61b3a8f17208528c984afd161" exitCode=0
Feb 21 21:56:23 crc kubenswrapper[4717]: I0221 21:56:23.799500 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp" event={"ID":"eb912310-4451-4fc6-b05b-675b6bcfff59","Type":"ContainerDied","Data":"aa54116fd1b7bcea6d4c2d10812e64af59a423d61b3a8f17208528c984afd161"}
Feb 21 21:56:25 crc kubenswrapper[4717]: I0221 21:56:25.121774 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp"
Feb 21 21:56:25 crc kubenswrapper[4717]: I0221 21:56:25.205943 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k89f\" (UniqueName: \"kubernetes.io/projected/eb912310-4451-4fc6-b05b-675b6bcfff59-kube-api-access-7k89f\") pod \"eb912310-4451-4fc6-b05b-675b6bcfff59\" (UID: \"eb912310-4451-4fc6-b05b-675b6bcfff59\") "
Feb 21 21:56:25 crc kubenswrapper[4717]: I0221 21:56:25.206072 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb912310-4451-4fc6-b05b-675b6bcfff59-util\") pod \"eb912310-4451-4fc6-b05b-675b6bcfff59\" (UID: \"eb912310-4451-4fc6-b05b-675b6bcfff59\") "
Feb 21 21:56:25 crc kubenswrapper[4717]: I0221 21:56:25.206144 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb912310-4451-4fc6-b05b-675b6bcfff59-bundle\") pod \"eb912310-4451-4fc6-b05b-675b6bcfff59\" (UID: \"eb912310-4451-4fc6-b05b-675b6bcfff59\") "
Feb 21 21:56:25 crc kubenswrapper[4717]: I0221 21:56:25.208508 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb912310-4451-4fc6-b05b-675b6bcfff59-bundle" (OuterVolumeSpecName: "bundle") pod "eb912310-4451-4fc6-b05b-675b6bcfff59" (UID: "eb912310-4451-4fc6-b05b-675b6bcfff59"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 21:56:25 crc kubenswrapper[4717]: I0221 21:56:25.216482 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb912310-4451-4fc6-b05b-675b6bcfff59-kube-api-access-7k89f" (OuterVolumeSpecName: "kube-api-access-7k89f") pod "eb912310-4451-4fc6-b05b-675b6bcfff59" (UID: "eb912310-4451-4fc6-b05b-675b6bcfff59"). InnerVolumeSpecName "kube-api-access-7k89f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 21:56:25 crc kubenswrapper[4717]: I0221 21:56:25.238214 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb912310-4451-4fc6-b05b-675b6bcfff59-util" (OuterVolumeSpecName: "util") pod "eb912310-4451-4fc6-b05b-675b6bcfff59" (UID: "eb912310-4451-4fc6-b05b-675b6bcfff59"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 21:56:25 crc kubenswrapper[4717]: I0221 21:56:25.309141 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k89f\" (UniqueName: \"kubernetes.io/projected/eb912310-4451-4fc6-b05b-675b6bcfff59-kube-api-access-7k89f\") on node \"crc\" DevicePath \"\""
Feb 21 21:56:25 crc kubenswrapper[4717]: I0221 21:56:25.309186 4717 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb912310-4451-4fc6-b05b-675b6bcfff59-util\") on node \"crc\" DevicePath \"\""
Feb 21 21:56:25 crc kubenswrapper[4717]: I0221 21:56:25.309204 4717 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb912310-4451-4fc6-b05b-675b6bcfff59-bundle\") on node \"crc\" DevicePath \"\""
Feb 21 21:56:25 crc kubenswrapper[4717]: I0221 21:56:25.821994 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp"
Feb 21 21:56:25 crc kubenswrapper[4717]: I0221 21:56:25.821998 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp" event={"ID":"eb912310-4451-4fc6-b05b-675b6bcfff59","Type":"ContainerDied","Data":"dc77d4e28599fc1e3359fd1e4f2031b3d7b28ce0ae09d520416d8e1a145fb759"}
Feb 21 21:56:25 crc kubenswrapper[4717]: I0221 21:56:25.822202 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc77d4e28599fc1e3359fd1e4f2031b3d7b28ce0ae09d520416d8e1a145fb759"
Feb 21 21:56:26 crc kubenswrapper[4717]: I0221 21:56:26.372033 4717 scope.go:117] "RemoveContainer" containerID="8cff732a3db53445e1fc44f977b3631c8cd4c2f5bf1cdfa1c8b21a54e531b87f"
Feb 21 21:56:26 crc kubenswrapper[4717]: I0221 21:56:26.904648 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-6z5b8"]
Feb 21 21:56:26 crc kubenswrapper[4717]: E0221 21:56:26.905072 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb912310-4451-4fc6-b05b-675b6bcfff59" containerName="extract"
Feb 21 21:56:26 crc kubenswrapper[4717]: I0221 21:56:26.905083 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb912310-4451-4fc6-b05b-675b6bcfff59" containerName="extract"
Feb 21 21:56:26 crc kubenswrapper[4717]: E0221 21:56:26.905097 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb912310-4451-4fc6-b05b-675b6bcfff59" containerName="pull"
Feb 21 21:56:26 crc kubenswrapper[4717]: I0221 21:56:26.905102 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb912310-4451-4fc6-b05b-675b6bcfff59" containerName="pull"
Feb 21 21:56:26 crc kubenswrapper[4717]: E0221 21:56:26.905121 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb912310-4451-4fc6-b05b-675b6bcfff59" containerName="util"
Feb 21 21:56:26 crc kubenswrapper[4717]: I0221 21:56:26.905127 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb912310-4451-4fc6-b05b-675b6bcfff59" containerName="util"
Feb 21 21:56:26 crc kubenswrapper[4717]: I0221 21:56:26.905212 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb912310-4451-4fc6-b05b-675b6bcfff59" containerName="extract"
Feb 21 21:56:26 crc kubenswrapper[4717]: I0221 21:56:26.905536 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-6z5b8"
Feb 21 21:56:26 crc kubenswrapper[4717]: I0221 21:56:26.907508 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Feb 21 21:56:26 crc kubenswrapper[4717]: I0221 21:56:26.908027 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-z9mbg"
Feb 21 21:56:26 crc kubenswrapper[4717]: I0221 21:56:26.908345 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Feb 21 21:56:26 crc kubenswrapper[4717]: I0221 21:56:26.920958 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-6z5b8"]
Feb 21 21:56:26 crc kubenswrapper[4717]: I0221 21:56:26.935902 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs484\" (UniqueName: \"kubernetes.io/projected/4074f791-7fb0-4f78-96e7-e926cebfab66-kube-api-access-rs484\") pod \"nmstate-operator-694c9596b7-6z5b8\" (UID: \"4074f791-7fb0-4f78-96e7-e926cebfab66\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-6z5b8"
Feb 21 21:56:27 crc kubenswrapper[4717]: I0221 21:56:27.037505 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs484\" (UniqueName: \"kubernetes.io/projected/4074f791-7fb0-4f78-96e7-e926cebfab66-kube-api-access-rs484\") pod \"nmstate-operator-694c9596b7-6z5b8\" (UID: \"4074f791-7fb0-4f78-96e7-e926cebfab66\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-6z5b8"
Feb 21 21:56:27 crc kubenswrapper[4717]: I0221 21:56:27.058927 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs484\" (UniqueName: \"kubernetes.io/projected/4074f791-7fb0-4f78-96e7-e926cebfab66-kube-api-access-rs484\") pod \"nmstate-operator-694c9596b7-6z5b8\" (UID: \"4074f791-7fb0-4f78-96e7-e926cebfab66\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-6z5b8"
Feb 21 21:56:27 crc kubenswrapper[4717]: I0221 21:56:27.220263 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-6z5b8"
Feb 21 21:56:27 crc kubenswrapper[4717]: I0221 21:56:27.434279 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-6z5b8"]
Feb 21 21:56:27 crc kubenswrapper[4717]: W0221 21:56:27.440451 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4074f791_7fb0_4f78_96e7_e926cebfab66.slice/crio-e9b0f3543c8e9b3fad8cfe663b1352e33b39da36dcb954f5b5b88b3b35475f93 WatchSource:0}: Error finding container e9b0f3543c8e9b3fad8cfe663b1352e33b39da36dcb954f5b5b88b3b35475f93: Status 404 returned error can't find the container with id e9b0f3543c8e9b3fad8cfe663b1352e33b39da36dcb954f5b5b88b3b35475f93
Feb 21 21:56:27 crc kubenswrapper[4717]: I0221 21:56:27.834452 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-6z5b8" event={"ID":"4074f791-7fb0-4f78-96e7-e926cebfab66","Type":"ContainerStarted","Data":"e9b0f3543c8e9b3fad8cfe663b1352e33b39da36dcb954f5b5b88b3b35475f93"}
Feb 21 21:56:29 crc kubenswrapper[4717]: I0221 21:56:29.850899 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-6z5b8" event={"ID":"4074f791-7fb0-4f78-96e7-e926cebfab66","Type":"ContainerStarted","Data":"04137b3b633800e5ae0a7fd9bbf2fd237b4f58bccaa85a781df51c9c599a8aba"}
Feb 21 21:56:30 crc kubenswrapper[4717]: I0221 21:56:30.888742 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-6z5b8" podStartSLOduration=2.7094264949999998 podStartE2EDuration="4.888720301s" podCreationTimestamp="2026-02-21 21:56:26 +0000 UTC" firstStartedPulling="2026-02-21 21:56:27.445151763 +0000 UTC m=+602.226685395" lastFinishedPulling="2026-02-21 21:56:29.624445579 +0000 UTC m=+604.405979201" observedRunningTime="2026-02-21 21:56:29.890089258 +0000 UTC m=+604.671622910" watchObservedRunningTime="2026-02-21 21:56:30.888720301 +0000 UTC m=+605.670253933"
Feb 21 21:56:30 crc kubenswrapper[4717]: I0221 21:56:30.890920 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-f4sk4"]
Feb 21 21:56:30 crc kubenswrapper[4717]: I0221 21:56:30.892211 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-f4sk4"
Feb 21 21:56:30 crc kubenswrapper[4717]: I0221 21:56:30.896661 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-ndzn6"
Feb 21 21:56:30 crc kubenswrapper[4717]: I0221 21:56:30.906091 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-f4sk4"]
Feb 21 21:56:30 crc kubenswrapper[4717]: I0221 21:56:30.918823 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4"]
Feb 21 21:56:30 crc kubenswrapper[4717]: I0221 21:56:30.919796 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4"
Feb 21 21:56:30 crc kubenswrapper[4717]: I0221 21:56:30.921781 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Feb 21 21:56:30 crc kubenswrapper[4717]: I0221 21:56:30.940630 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-wls9d"]
Feb 21 21:56:30 crc kubenswrapper[4717]: I0221 21:56:30.941509 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-wls9d"
Feb 21 21:56:30 crc kubenswrapper[4717]: I0221 21:56:30.950393 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4"]
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.006674 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzgcf\" (UniqueName: \"kubernetes.io/projected/3e791ecf-1811-4ca1-b08c-5e19bb2ee4e1-kube-api-access-bzgcf\") pod \"nmstate-metrics-58c85c668d-f4sk4\" (UID: \"3e791ecf-1811-4ca1-b08c-5e19bb2ee4e1\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-f4sk4"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.006726 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/68767c95-c372-4de6-bab2-8eaae4cb37b3-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-g8tr4\" (UID: \"68767c95-c372-4de6-bab2-8eaae4cb37b3\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.006745 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/412a89db-473e-4710-ab4a-ecda68d76787-nmstate-lock\") pod \"nmstate-handler-wls9d\" (UID: \"412a89db-473e-4710-ab4a-ecda68d76787\") " pod="openshift-nmstate/nmstate-handler-wls9d"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.006764 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s4lq\" (UniqueName: \"kubernetes.io/projected/412a89db-473e-4710-ab4a-ecda68d76787-kube-api-access-6s4lq\") pod \"nmstate-handler-wls9d\" (UID: \"412a89db-473e-4710-ab4a-ecda68d76787\") " pod="openshift-nmstate/nmstate-handler-wls9d"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.006781 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjs2m\" (UniqueName: \"kubernetes.io/projected/68767c95-c372-4de6-bab2-8eaae4cb37b3-kube-api-access-qjs2m\") pod \"nmstate-webhook-866bcb46dc-g8tr4\" (UID: \"68767c95-c372-4de6-bab2-8eaae4cb37b3\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.006807 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/412a89db-473e-4710-ab4a-ecda68d76787-ovs-socket\") pod \"nmstate-handler-wls9d\" (UID: \"412a89db-473e-4710-ab4a-ecda68d76787\") " pod="openshift-nmstate/nmstate-handler-wls9d"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.006971 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/412a89db-473e-4710-ab4a-ecda68d76787-dbus-socket\") pod \"nmstate-handler-wls9d\" (UID: \"412a89db-473e-4710-ab4a-ecda68d76787\") " pod="openshift-nmstate/nmstate-handler-wls9d"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.032558 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7"]
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.033154 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.038147 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.038366 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-xsb8v"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.038475 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.046832 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7"]
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.107824 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/68767c95-c372-4de6-bab2-8eaae4cb37b3-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-g8tr4\" (UID: \"68767c95-c372-4de6-bab2-8eaae4cb37b3\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.107876 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/412a89db-473e-4710-ab4a-ecda68d76787-nmstate-lock\") pod \"nmstate-handler-wls9d\" (UID: \"412a89db-473e-4710-ab4a-ecda68d76787\") " pod="openshift-nmstate/nmstate-handler-wls9d"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.107907 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s4lq\" (UniqueName: \"kubernetes.io/projected/412a89db-473e-4710-ab4a-ecda68d76787-kube-api-access-6s4lq\") pod \"nmstate-handler-wls9d\" (UID: \"412a89db-473e-4710-ab4a-ecda68d76787\") " pod="openshift-nmstate/nmstate-handler-wls9d"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.107930 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjs2m\" (UniqueName: \"kubernetes.io/projected/68767c95-c372-4de6-bab2-8eaae4cb37b3-kube-api-access-qjs2m\") pod \"nmstate-webhook-866bcb46dc-g8tr4\" (UID: \"68767c95-c372-4de6-bab2-8eaae4cb37b3\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.107972 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bef85b90-a8e3-4dbe-b0e7-f57e585bbc15-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-sdlq7\" (UID: \"bef85b90-a8e3-4dbe-b0e7-f57e585bbc15\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7"
Feb 21 21:56:31 crc kubenswrapper[4717]: E0221 21:56:31.107980 4717 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.108022 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/412a89db-473e-4710-ab4a-ecda68d76787-ovs-socket\") pod \"nmstate-handler-wls9d\" (UID: \"412a89db-473e-4710-ab4a-ecda68d76787\") " pod="openshift-nmstate/nmstate-handler-wls9d"
Feb 21 21:56:31 crc kubenswrapper[4717]: E0221 21:56:31.108053 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68767c95-c372-4de6-bab2-8eaae4cb37b3-tls-key-pair podName:68767c95-c372-4de6-bab2-8eaae4cb37b3 nodeName:}" failed. No retries permitted until 2026-02-21 21:56:31.608035918 +0000 UTC m=+606.389569540 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/68767c95-c372-4de6-bab2-8eaae4cb37b3-tls-key-pair") pod "nmstate-webhook-866bcb46dc-g8tr4" (UID: "68767c95-c372-4de6-bab2-8eaae4cb37b3") : secret "openshift-nmstate-webhook" not found
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.108072 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/412a89db-473e-4710-ab4a-ecda68d76787-ovs-socket\") pod \"nmstate-handler-wls9d\" (UID: \"412a89db-473e-4710-ab4a-ecda68d76787\") " pod="openshift-nmstate/nmstate-handler-wls9d"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.108173 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/412a89db-473e-4710-ab4a-ecda68d76787-dbus-socket\") pod \"nmstate-handler-wls9d\" (UID: \"412a89db-473e-4710-ab4a-ecda68d76787\") " pod="openshift-nmstate/nmstate-handler-wls9d"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.108279 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bef85b90-a8e3-4dbe-b0e7-f57e585bbc15-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-sdlq7\" (UID: \"bef85b90-a8e3-4dbe-b0e7-f57e585bbc15\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.108291 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/412a89db-473e-4710-ab4a-ecda68d76787-nmstate-lock\") pod \"nmstate-handler-wls9d\" (UID: \"412a89db-473e-4710-ab4a-ecda68d76787\") " pod="openshift-nmstate/nmstate-handler-wls9d"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.108322 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzgcf\" (UniqueName: \"kubernetes.io/projected/3e791ecf-1811-4ca1-b08c-5e19bb2ee4e1-kube-api-access-bzgcf\") pod \"nmstate-metrics-58c85c668d-f4sk4\" (UID: \"3e791ecf-1811-4ca1-b08c-5e19bb2ee4e1\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-f4sk4"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.108395 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-265n2\" (UniqueName: \"kubernetes.io/projected/bef85b90-a8e3-4dbe-b0e7-f57e585bbc15-kube-api-access-265n2\") pod \"nmstate-console-plugin-5c78fc5d65-sdlq7\" (UID: \"bef85b90-a8e3-4dbe-b0e7-f57e585bbc15\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.108698 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/412a89db-473e-4710-ab4a-ecda68d76787-dbus-socket\") pod \"nmstate-handler-wls9d\" (UID: \"412a89db-473e-4710-ab4a-ecda68d76787\") " pod="openshift-nmstate/nmstate-handler-wls9d"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.126784 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s4lq\" (UniqueName: \"kubernetes.io/projected/412a89db-473e-4710-ab4a-ecda68d76787-kube-api-access-6s4lq\") pod \"nmstate-handler-wls9d\" (UID: \"412a89db-473e-4710-ab4a-ecda68d76787\") " pod="openshift-nmstate/nmstate-handler-wls9d"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.143771 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzgcf\" (UniqueName: \"kubernetes.io/projected/3e791ecf-1811-4ca1-b08c-5e19bb2ee4e1-kube-api-access-bzgcf\") pod \"nmstate-metrics-58c85c668d-f4sk4\" (UID: \"3e791ecf-1811-4ca1-b08c-5e19bb2ee4e1\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-f4sk4"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.144606 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjs2m\" (UniqueName: \"kubernetes.io/projected/68767c95-c372-4de6-bab2-8eaae4cb37b3-kube-api-access-qjs2m\") pod \"nmstate-webhook-866bcb46dc-g8tr4\" (UID: \"68767c95-c372-4de6-bab2-8eaae4cb37b3\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.209203 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bef85b90-a8e3-4dbe-b0e7-f57e585bbc15-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-sdlq7\" (UID: \"bef85b90-a8e3-4dbe-b0e7-f57e585bbc15\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.209255 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-265n2\" (UniqueName: \"kubernetes.io/projected/bef85b90-a8e3-4dbe-b0e7-f57e585bbc15-kube-api-access-265n2\") pod \"nmstate-console-plugin-5c78fc5d65-sdlq7\" (UID: \"bef85b90-a8e3-4dbe-b0e7-f57e585bbc15\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.209302 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bef85b90-a8e3-4dbe-b0e7-f57e585bbc15-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-sdlq7\" (UID: \"bef85b90-a8e3-4dbe-b0e7-f57e585bbc15\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.210117 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bef85b90-a8e3-4dbe-b0e7-f57e585bbc15-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-sdlq7\" (UID: \"bef85b90-a8e3-4dbe-b0e7-f57e585bbc15\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.225284 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-f4sk4"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.226255 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bef85b90-a8e3-4dbe-b0e7-f57e585bbc15-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-sdlq7\" (UID: \"bef85b90-a8e3-4dbe-b0e7-f57e585bbc15\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.235587 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-265n2\" (UniqueName: \"kubernetes.io/projected/bef85b90-a8e3-4dbe-b0e7-f57e585bbc15-kube-api-access-265n2\") pod \"nmstate-console-plugin-5c78fc5d65-sdlq7\" (UID: \"bef85b90-a8e3-4dbe-b0e7-f57e585bbc15\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.253909 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-76f59b595d-tjb9g"]
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.254616 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.265315 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-wls9d"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.271148 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76f59b595d-tjb9g"]
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.311829 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ccefef27-657f-412f-8bc9-19e3952e0a19-console-oauth-config\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.311878 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxxsg\" (UniqueName: \"kubernetes.io/projected/ccefef27-657f-412f-8bc9-19e3952e0a19-kube-api-access-bxxsg\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.312013 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ccefef27-657f-412f-8bc9-19e3952e0a19-oauth-serving-cert\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.312060 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ccefef27-657f-412f-8bc9-19e3952e0a19-service-ca\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.312097 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ccefef27-657f-412f-8bc9-19e3952e0a19-console-config\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.312122 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccefef27-657f-412f-8bc9-19e3952e0a19-trusted-ca-bundle\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.312223 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccefef27-657f-412f-8bc9-19e3952e0a19-console-serving-cert\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.346080 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.414101 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccefef27-657f-412f-8bc9-19e3952e0a19-console-serving-cert\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.414486 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ccefef27-657f-412f-8bc9-19e3952e0a19-console-oauth-config\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.414517 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxxsg\" (UniqueName: \"kubernetes.io/projected/ccefef27-657f-412f-8bc9-19e3952e0a19-kube-api-access-bxxsg\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.414543 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ccefef27-657f-412f-8bc9-19e3952e0a19-oauth-serving-cert\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.414566 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ccefef27-657f-412f-8bc9-19e3952e0a19-service-ca\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.414586 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ccefef27-657f-412f-8bc9-19e3952e0a19-console-config\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.414604 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccefef27-657f-412f-8bc9-19e3952e0a19-trusted-ca-bundle\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.417899 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ccefef27-657f-412f-8bc9-19e3952e0a19-console-oauth-config\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.418808 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccefef27-657f-412f-8bc9-19e3952e0a19-trusted-ca-bundle\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.421083 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ccefef27-657f-412f-8bc9-19e3952e0a19-service-ca\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.421567 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ccefef27-657f-412f-8bc9-19e3952e0a19-oauth-serving-cert\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.422237 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccefef27-657f-412f-8bc9-19e3952e0a19-console-serving-cert\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.427654 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ccefef27-657f-412f-8bc9-19e3952e0a19-console-config\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.429463 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxxsg\" (UniqueName: \"kubernetes.io/projected/ccefef27-657f-412f-8bc9-19e3952e0a19-kube-api-access-bxxsg\") pod \"console-76f59b595d-tjb9g\" (UID: \"ccefef27-657f-412f-8bc9-19e3952e0a19\") " pod="openshift-console/console-76f59b595d-tjb9g"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.511794 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-f4sk4"]
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.557199 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7"]
Feb 21 21:56:31 crc kubenswrapper[4717]: W0221 21:56:31.558852 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbef85b90_a8e3_4dbe_b0e7_f57e585bbc15.slice/crio-cdc52db0d0644d2764db56aa9f2498a1b0b0bb1f0f9df611ae711565c9f0ca09 WatchSource:0}: Error finding container cdc52db0d0644d2764db56aa9f2498a1b0b0bb1f0f9df611ae711565c9f0ca09: Status 404 returned error can't find the container with id cdc52db0d0644d2764db56aa9f2498a1b0b0bb1f0f9df611ae711565c9f0ca09
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.619300 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/68767c95-c372-4de6-bab2-8eaae4cb37b3-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-g8tr4\" (UID: \"68767c95-c372-4de6-bab2-8eaae4cb37b3\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.627001 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/68767c95-c372-4de6-bab2-8eaae4cb37b3-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-g8tr4\" (UID: \"68767c95-c372-4de6-bab2-8eaae4cb37b3\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4"
Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.627373 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76f59b595d-tjb9g" Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.819349 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76f59b595d-tjb9g"] Feb 21 21:56:31 crc kubenswrapper[4717]: W0221 21:56:31.822072 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccefef27_657f_412f_8bc9_19e3952e0a19.slice/crio-c82a9bbae5185411782060a61ad9fd57ab206271ce0bc9c98f31baa5a84732cd WatchSource:0}: Error finding container c82a9bbae5185411782060a61ad9fd57ab206271ce0bc9c98f31baa5a84732cd: Status 404 returned error can't find the container with id c82a9bbae5185411782060a61ad9fd57ab206271ce0bc9c98f31baa5a84732cd Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.837637 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4" Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.868549 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wls9d" event={"ID":"412a89db-473e-4710-ab4a-ecda68d76787","Type":"ContainerStarted","Data":"06a4476c6c71242e4c4f4e6abb360e100f3e9de456310e32c65d34f25492cbe2"} Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.870127 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-f4sk4" event={"ID":"3e791ecf-1811-4ca1-b08c-5e19bb2ee4e1","Type":"ContainerStarted","Data":"44068f6a1473eb15bae5a9d80f53c1ed6b936f90c5df86fc70852b9c3f947591"} Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 21:56:31.871212 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76f59b595d-tjb9g" event={"ID":"ccefef27-657f-412f-8bc9-19e3952e0a19","Type":"ContainerStarted","Data":"c82a9bbae5185411782060a61ad9fd57ab206271ce0bc9c98f31baa5a84732cd"} Feb 21 21:56:31 crc kubenswrapper[4717]: I0221 
21:56:31.872220 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7" event={"ID":"bef85b90-a8e3-4dbe-b0e7-f57e585bbc15","Type":"ContainerStarted","Data":"cdc52db0d0644d2764db56aa9f2498a1b0b0bb1f0f9df611ae711565c9f0ca09"} Feb 21 21:56:32 crc kubenswrapper[4717]: I0221 21:56:32.051733 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4"] Feb 21 21:56:32 crc kubenswrapper[4717]: I0221 21:56:32.880608 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4" event={"ID":"68767c95-c372-4de6-bab2-8eaae4cb37b3","Type":"ContainerStarted","Data":"6685508b3ba78e86138d87c8b1fc42d7b9f744cabc3c64f3b895cbce6540ff7f"} Feb 21 21:56:32 crc kubenswrapper[4717]: I0221 21:56:32.883738 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76f59b595d-tjb9g" event={"ID":"ccefef27-657f-412f-8bc9-19e3952e0a19","Type":"ContainerStarted","Data":"386c1a5770dbe82119fb2d82644b22d182d8d41a9e52a9850bbdcebdc62053e2"} Feb 21 21:56:32 crc kubenswrapper[4717]: I0221 21:56:32.911165 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76f59b595d-tjb9g" podStartSLOduration=1.911147116 podStartE2EDuration="1.911147116s" podCreationTimestamp="2026-02-21 21:56:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:56:32.906586937 +0000 UTC m=+607.688120559" watchObservedRunningTime="2026-02-21 21:56:32.911147116 +0000 UTC m=+607.692680748" Feb 21 21:56:34 crc kubenswrapper[4717]: I0221 21:56:34.905747 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-f4sk4" 
event={"ID":"3e791ecf-1811-4ca1-b08c-5e19bb2ee4e1","Type":"ContainerStarted","Data":"d11dd4efc1135bab860d664475e3c3fe2bf78d822f1298012fab818999a0df3b"} Feb 21 21:56:34 crc kubenswrapper[4717]: I0221 21:56:34.910954 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4" event={"ID":"68767c95-c372-4de6-bab2-8eaae4cb37b3","Type":"ContainerStarted","Data":"c23982439302fda34542f315e45b54857c57062a9310000180ce733e430970ab"} Feb 21 21:56:34 crc kubenswrapper[4717]: I0221 21:56:34.911490 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4" Feb 21 21:56:34 crc kubenswrapper[4717]: I0221 21:56:34.916309 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7" event={"ID":"bef85b90-a8e3-4dbe-b0e7-f57e585bbc15","Type":"ContainerStarted","Data":"d24d4518eca2800223decd2876697598b3c711aa1aecd6d8cdbf2b7fe907c571"} Feb 21 21:56:34 crc kubenswrapper[4717]: I0221 21:56:34.917930 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wls9d" event={"ID":"412a89db-473e-4710-ab4a-ecda68d76787","Type":"ContainerStarted","Data":"b20e576dfe0525ef900d89435a2b16959cbab0c13e154ec253dd4dba3aa46c19"} Feb 21 21:56:34 crc kubenswrapper[4717]: I0221 21:56:34.918081 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-wls9d" Feb 21 21:56:34 crc kubenswrapper[4717]: I0221 21:56:34.940581 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4" podStartSLOduration=2.5406388829999997 podStartE2EDuration="4.940561248s" podCreationTimestamp="2026-02-21 21:56:30 +0000 UTC" firstStartedPulling="2026-02-21 21:56:32.062258814 +0000 UTC m=+606.843792436" lastFinishedPulling="2026-02-21 21:56:34.462181179 +0000 UTC m=+609.243714801" 
observedRunningTime="2026-02-21 21:56:34.936715886 +0000 UTC m=+609.718249518" watchObservedRunningTime="2026-02-21 21:56:34.940561248 +0000 UTC m=+609.722094890" Feb 21 21:56:34 crc kubenswrapper[4717]: I0221 21:56:34.960215 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-wls9d" podStartSLOduration=1.824139331 podStartE2EDuration="4.960190665s" podCreationTimestamp="2026-02-21 21:56:30 +0000 UTC" firstStartedPulling="2026-02-21 21:56:31.327020226 +0000 UTC m=+606.108553848" lastFinishedPulling="2026-02-21 21:56:34.46307152 +0000 UTC m=+609.244605182" observedRunningTime="2026-02-21 21:56:34.954890939 +0000 UTC m=+609.736424571" watchObservedRunningTime="2026-02-21 21:56:34.960190665 +0000 UTC m=+609.741724317" Feb 21 21:56:34 crc kubenswrapper[4717]: I0221 21:56:34.974272 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-sdlq7" podStartSLOduration=1.082157729 podStartE2EDuration="3.974247469s" podCreationTimestamp="2026-02-21 21:56:31 +0000 UTC" firstStartedPulling="2026-02-21 21:56:31.560184313 +0000 UTC m=+606.341717935" lastFinishedPulling="2026-02-21 21:56:34.452274013 +0000 UTC m=+609.233807675" observedRunningTime="2026-02-21 21:56:34.970765676 +0000 UTC m=+609.752299308" watchObservedRunningTime="2026-02-21 21:56:34.974247469 +0000 UTC m=+609.755781121" Feb 21 21:56:37 crc kubenswrapper[4717]: I0221 21:56:37.941528 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-f4sk4" event={"ID":"3e791ecf-1811-4ca1-b08c-5e19bb2ee4e1","Type":"ContainerStarted","Data":"9db18f0a2cca54336b35f822a53b08d52fb6452a5354d912f808cc0cb9bdfc8f"} Feb 21 21:56:37 crc kubenswrapper[4717]: I0221 21:56:37.974091 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-f4sk4" podStartSLOduration=2.521543278 
podStartE2EDuration="7.974063932s" podCreationTimestamp="2026-02-21 21:56:30 +0000 UTC" firstStartedPulling="2026-02-21 21:56:31.519402922 +0000 UTC m=+606.300936544" lastFinishedPulling="2026-02-21 21:56:36.971923576 +0000 UTC m=+611.753457198" observedRunningTime="2026-02-21 21:56:37.963704096 +0000 UTC m=+612.745237728" watchObservedRunningTime="2026-02-21 21:56:37.974063932 +0000 UTC m=+612.755597594" Feb 21 21:56:41 crc kubenswrapper[4717]: I0221 21:56:41.295527 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-wls9d" Feb 21 21:56:41 crc kubenswrapper[4717]: I0221 21:56:41.627829 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76f59b595d-tjb9g" Feb 21 21:56:41 crc kubenswrapper[4717]: I0221 21:56:41.627912 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-76f59b595d-tjb9g" Feb 21 21:56:41 crc kubenswrapper[4717]: I0221 21:56:41.640461 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76f59b595d-tjb9g" Feb 21 21:56:41 crc kubenswrapper[4717]: I0221 21:56:41.989516 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76f59b595d-tjb9g" Feb 21 21:56:42 crc kubenswrapper[4717]: I0221 21:56:42.059387 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-47lf7"] Feb 21 21:56:51 crc kubenswrapper[4717]: I0221 21:56:51.848159 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-g8tr4" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.133243 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-47lf7" podUID="17bb07e6-67dd-4cc5-b979-9ef794228e81" containerName="console" 
containerID="cri-o://d5234076e40cdd477af37c25c5f50bc96be68a8f5e1142e726f1bf1088f02abd" gracePeriod=15 Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.523818 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-47lf7_17bb07e6-67dd-4cc5-b979-9ef794228e81/console/0.log" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.524394 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.639200 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-service-ca\") pod \"17bb07e6-67dd-4cc5-b979-9ef794228e81\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.639290 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-278b8\" (UniqueName: \"kubernetes.io/projected/17bb07e6-67dd-4cc5-b979-9ef794228e81-kube-api-access-278b8\") pod \"17bb07e6-67dd-4cc5-b979-9ef794228e81\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.639345 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-trusted-ca-bundle\") pod \"17bb07e6-67dd-4cc5-b979-9ef794228e81\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.639424 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-oauth-serving-cert\") pod \"17bb07e6-67dd-4cc5-b979-9ef794228e81\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 
21:57:07.639473 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-serving-cert\") pod \"17bb07e6-67dd-4cc5-b979-9ef794228e81\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.639549 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-oauth-config\") pod \"17bb07e6-67dd-4cc5-b979-9ef794228e81\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.639609 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-config\") pod \"17bb07e6-67dd-4cc5-b979-9ef794228e81\" (UID: \"17bb07e6-67dd-4cc5-b979-9ef794228e81\") " Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.640335 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-service-ca" (OuterVolumeSpecName: "service-ca") pod "17bb07e6-67dd-4cc5-b979-9ef794228e81" (UID: "17bb07e6-67dd-4cc5-b979-9ef794228e81"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.640531 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "17bb07e6-67dd-4cc5-b979-9ef794228e81" (UID: "17bb07e6-67dd-4cc5-b979-9ef794228e81"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.641127 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "17bb07e6-67dd-4cc5-b979-9ef794228e81" (UID: "17bb07e6-67dd-4cc5-b979-9ef794228e81"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.641802 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-config" (OuterVolumeSpecName: "console-config") pod "17bb07e6-67dd-4cc5-b979-9ef794228e81" (UID: "17bb07e6-67dd-4cc5-b979-9ef794228e81"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.649135 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17bb07e6-67dd-4cc5-b979-9ef794228e81-kube-api-access-278b8" (OuterVolumeSpecName: "kube-api-access-278b8") pod "17bb07e6-67dd-4cc5-b979-9ef794228e81" (UID: "17bb07e6-67dd-4cc5-b979-9ef794228e81"). InnerVolumeSpecName "kube-api-access-278b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.650373 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "17bb07e6-67dd-4cc5-b979-9ef794228e81" (UID: "17bb07e6-67dd-4cc5-b979-9ef794228e81"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.652432 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "17bb07e6-67dd-4cc5-b979-9ef794228e81" (UID: "17bb07e6-67dd-4cc5-b979-9ef794228e81"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.741234 4717 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.741482 4717 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.741566 4717 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.741640 4717 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-console-config\") on node \"crc\" DevicePath \"\"" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.741703 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-service-ca\") on node \"crc\" DevicePath \"\"" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.741776 4717 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-278b8\" (UniqueName: \"kubernetes.io/projected/17bb07e6-67dd-4cc5-b979-9ef794228e81-kube-api-access-278b8\") on node \"crc\" DevicePath \"\"" Feb 21 21:57:07 crc kubenswrapper[4717]: I0221 21:57:07.741849 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17bb07e6-67dd-4cc5-b979-9ef794228e81-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.180737 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-47lf7_17bb07e6-67dd-4cc5-b979-9ef794228e81/console/0.log" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.180988 4717 generic.go:334] "Generic (PLEG): container finished" podID="17bb07e6-67dd-4cc5-b979-9ef794228e81" containerID="d5234076e40cdd477af37c25c5f50bc96be68a8f5e1142e726f1bf1088f02abd" exitCode=2 Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.181030 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-47lf7" event={"ID":"17bb07e6-67dd-4cc5-b979-9ef794228e81","Type":"ContainerDied","Data":"d5234076e40cdd477af37c25c5f50bc96be68a8f5e1142e726f1bf1088f02abd"} Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.181069 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-47lf7" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.181084 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-47lf7" event={"ID":"17bb07e6-67dd-4cc5-b979-9ef794228e81","Type":"ContainerDied","Data":"6f8bf24181fcd83bc0444819a04171cb4c2426413dde130d90a805ca205d20a6"} Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.181103 4717 scope.go:117] "RemoveContainer" containerID="d5234076e40cdd477af37c25c5f50bc96be68a8f5e1142e726f1bf1088f02abd" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.201101 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-47lf7"] Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.206993 4717 scope.go:117] "RemoveContainer" containerID="d5234076e40cdd477af37c25c5f50bc96be68a8f5e1142e726f1bf1088f02abd" Feb 21 21:57:08 crc kubenswrapper[4717]: E0221 21:57:08.207448 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5234076e40cdd477af37c25c5f50bc96be68a8f5e1142e726f1bf1088f02abd\": container with ID starting with d5234076e40cdd477af37c25c5f50bc96be68a8f5e1142e726f1bf1088f02abd not found: ID does not exist" containerID="d5234076e40cdd477af37c25c5f50bc96be68a8f5e1142e726f1bf1088f02abd" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.207476 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5234076e40cdd477af37c25c5f50bc96be68a8f5e1142e726f1bf1088f02abd"} err="failed to get container status \"d5234076e40cdd477af37c25c5f50bc96be68a8f5e1142e726f1bf1088f02abd\": rpc error: code = NotFound desc = could not find container \"d5234076e40cdd477af37c25c5f50bc96be68a8f5e1142e726f1bf1088f02abd\": container with ID starting with d5234076e40cdd477af37c25c5f50bc96be68a8f5e1142e726f1bf1088f02abd not found: ID does not exist" Feb 21 21:57:08 crc 
kubenswrapper[4717]: I0221 21:57:08.207817 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-47lf7"] Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.453993 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff"] Feb 21 21:57:08 crc kubenswrapper[4717]: E0221 21:57:08.454474 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17bb07e6-67dd-4cc5-b979-9ef794228e81" containerName="console" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.454532 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="17bb07e6-67dd-4cc5-b979-9ef794228e81" containerName="console" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.454757 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="17bb07e6-67dd-4cc5-b979-9ef794228e81" containerName="console" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.456617 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.464639 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff"] Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.465149 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.652574 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c14c7d98-9c2b-456a-9dc8-0857592681bb-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff\" (UID: \"c14c7d98-9c2b-456a-9dc8-0857592681bb\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.652834 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqrxz\" (UniqueName: \"kubernetes.io/projected/c14c7d98-9c2b-456a-9dc8-0857592681bb-kube-api-access-pqrxz\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff\" (UID: \"c14c7d98-9c2b-456a-9dc8-0857592681bb\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.652989 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c14c7d98-9c2b-456a-9dc8-0857592681bb-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff\" (UID: \"c14c7d98-9c2b-456a-9dc8-0857592681bb\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" Feb 21 21:57:08 crc kubenswrapper[4717]: 
I0221 21:57:08.754144 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqrxz\" (UniqueName: \"kubernetes.io/projected/c14c7d98-9c2b-456a-9dc8-0857592681bb-kube-api-access-pqrxz\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff\" (UID: \"c14c7d98-9c2b-456a-9dc8-0857592681bb\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.754307 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c14c7d98-9c2b-456a-9dc8-0857592681bb-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff\" (UID: \"c14c7d98-9c2b-456a-9dc8-0857592681bb\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.754375 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c14c7d98-9c2b-456a-9dc8-0857592681bb-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff\" (UID: \"c14c7d98-9c2b-456a-9dc8-0857592681bb\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.755207 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c14c7d98-9c2b-456a-9dc8-0857592681bb-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff\" (UID: \"c14c7d98-9c2b-456a-9dc8-0857592681bb\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.755338 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/c14c7d98-9c2b-456a-9dc8-0857592681bb-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff\" (UID: \"c14c7d98-9c2b-456a-9dc8-0857592681bb\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" Feb 21 21:57:08 crc kubenswrapper[4717]: I0221 21:57:08.791076 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqrxz\" (UniqueName: \"kubernetes.io/projected/c14c7d98-9c2b-456a-9dc8-0857592681bb-kube-api-access-pqrxz\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff\" (UID: \"c14c7d98-9c2b-456a-9dc8-0857592681bb\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" Feb 21 21:57:09 crc kubenswrapper[4717]: I0221 21:57:09.087649 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" Feb 21 21:57:09 crc kubenswrapper[4717]: I0221 21:57:09.408502 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff"] Feb 21 21:57:09 crc kubenswrapper[4717]: I0221 21:57:09.985286 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17bb07e6-67dd-4cc5-b979-9ef794228e81" path="/var/lib/kubelet/pods/17bb07e6-67dd-4cc5-b979-9ef794228e81/volumes" Feb 21 21:57:10 crc kubenswrapper[4717]: I0221 21:57:10.197248 4717 generic.go:334] "Generic (PLEG): container finished" podID="c14c7d98-9c2b-456a-9dc8-0857592681bb" containerID="8d5b25e253d576593be03caea59ecaae8c8ecf7f68651b62504bf1fdff819a5a" exitCode=0 Feb 21 21:57:10 crc kubenswrapper[4717]: I0221 21:57:10.197336 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" 
event={"ID":"c14c7d98-9c2b-456a-9dc8-0857592681bb","Type":"ContainerDied","Data":"8d5b25e253d576593be03caea59ecaae8c8ecf7f68651b62504bf1fdff819a5a"} Feb 21 21:57:10 crc kubenswrapper[4717]: I0221 21:57:10.197734 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" event={"ID":"c14c7d98-9c2b-456a-9dc8-0857592681bb","Type":"ContainerStarted","Data":"63a076d1e88a830ad2208bd5453a0a8db88730da68f57cdf58102ba31eab0cf2"} Feb 21 21:57:13 crc kubenswrapper[4717]: I0221 21:57:13.244507 4717 generic.go:334] "Generic (PLEG): container finished" podID="c14c7d98-9c2b-456a-9dc8-0857592681bb" containerID="4349bd00785886464718f5a938b7c35ba4293e077f6585391dd9521323f514a7" exitCode=0 Feb 21 21:57:13 crc kubenswrapper[4717]: I0221 21:57:13.244589 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" event={"ID":"c14c7d98-9c2b-456a-9dc8-0857592681bb","Type":"ContainerDied","Data":"4349bd00785886464718f5a938b7c35ba4293e077f6585391dd9521323f514a7"} Feb 21 21:57:14 crc kubenswrapper[4717]: I0221 21:57:14.255550 4717 generic.go:334] "Generic (PLEG): container finished" podID="c14c7d98-9c2b-456a-9dc8-0857592681bb" containerID="210883a4d74f9191828439afead3b38a26e0fd6cb8b64ba34602f063c1e191b9" exitCode=0 Feb 21 21:57:14 crc kubenswrapper[4717]: I0221 21:57:14.255682 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" event={"ID":"c14c7d98-9c2b-456a-9dc8-0857592681bb","Type":"ContainerDied","Data":"210883a4d74f9191828439afead3b38a26e0fd6cb8b64ba34602f063c1e191b9"} Feb 21 21:57:15 crc kubenswrapper[4717]: I0221 21:57:15.601439 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" Feb 21 21:57:15 crc kubenswrapper[4717]: I0221 21:57:15.648510 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c14c7d98-9c2b-456a-9dc8-0857592681bb-bundle\") pod \"c14c7d98-9c2b-456a-9dc8-0857592681bb\" (UID: \"c14c7d98-9c2b-456a-9dc8-0857592681bb\") " Feb 21 21:57:15 crc kubenswrapper[4717]: I0221 21:57:15.648612 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c14c7d98-9c2b-456a-9dc8-0857592681bb-util\") pod \"c14c7d98-9c2b-456a-9dc8-0857592681bb\" (UID: \"c14c7d98-9c2b-456a-9dc8-0857592681bb\") " Feb 21 21:57:15 crc kubenswrapper[4717]: I0221 21:57:15.648704 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqrxz\" (UniqueName: \"kubernetes.io/projected/c14c7d98-9c2b-456a-9dc8-0857592681bb-kube-api-access-pqrxz\") pod \"c14c7d98-9c2b-456a-9dc8-0857592681bb\" (UID: \"c14c7d98-9c2b-456a-9dc8-0857592681bb\") " Feb 21 21:57:15 crc kubenswrapper[4717]: I0221 21:57:15.650830 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c14c7d98-9c2b-456a-9dc8-0857592681bb-bundle" (OuterVolumeSpecName: "bundle") pod "c14c7d98-9c2b-456a-9dc8-0857592681bb" (UID: "c14c7d98-9c2b-456a-9dc8-0857592681bb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:57:15 crc kubenswrapper[4717]: I0221 21:57:15.661911 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c14c7d98-9c2b-456a-9dc8-0857592681bb-kube-api-access-pqrxz" (OuterVolumeSpecName: "kube-api-access-pqrxz") pod "c14c7d98-9c2b-456a-9dc8-0857592681bb" (UID: "c14c7d98-9c2b-456a-9dc8-0857592681bb"). InnerVolumeSpecName "kube-api-access-pqrxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:57:15 crc kubenswrapper[4717]: I0221 21:57:15.662933 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c14c7d98-9c2b-456a-9dc8-0857592681bb-util" (OuterVolumeSpecName: "util") pod "c14c7d98-9c2b-456a-9dc8-0857592681bb" (UID: "c14c7d98-9c2b-456a-9dc8-0857592681bb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:57:15 crc kubenswrapper[4717]: I0221 21:57:15.750293 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqrxz\" (UniqueName: \"kubernetes.io/projected/c14c7d98-9c2b-456a-9dc8-0857592681bb-kube-api-access-pqrxz\") on node \"crc\" DevicePath \"\"" Feb 21 21:57:15 crc kubenswrapper[4717]: I0221 21:57:15.750343 4717 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c14c7d98-9c2b-456a-9dc8-0857592681bb-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 21:57:15 crc kubenswrapper[4717]: I0221 21:57:15.750361 4717 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c14c7d98-9c2b-456a-9dc8-0857592681bb-util\") on node \"crc\" DevicePath \"\"" Feb 21 21:57:16 crc kubenswrapper[4717]: I0221 21:57:16.300604 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" event={"ID":"c14c7d98-9c2b-456a-9dc8-0857592681bb","Type":"ContainerDied","Data":"63a076d1e88a830ad2208bd5453a0a8db88730da68f57cdf58102ba31eab0cf2"} Feb 21 21:57:16 crc kubenswrapper[4717]: I0221 21:57:16.300790 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63a076d1e88a830ad2208bd5453a0a8db88730da68f57cdf58102ba31eab0cf2" Feb 21 21:57:16 crc kubenswrapper[4717]: I0221 21:57:16.300796 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.622617 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv"] Feb 21 21:57:26 crc kubenswrapper[4717]: E0221 21:57:26.623335 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14c7d98-9c2b-456a-9dc8-0857592681bb" containerName="pull" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.623348 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14c7d98-9c2b-456a-9dc8-0857592681bb" containerName="pull" Feb 21 21:57:26 crc kubenswrapper[4717]: E0221 21:57:26.623361 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14c7d98-9c2b-456a-9dc8-0857592681bb" containerName="extract" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.623367 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14c7d98-9c2b-456a-9dc8-0857592681bb" containerName="extract" Feb 21 21:57:26 crc kubenswrapper[4717]: E0221 21:57:26.623379 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14c7d98-9c2b-456a-9dc8-0857592681bb" containerName="util" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.623385 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14c7d98-9c2b-456a-9dc8-0857592681bb" containerName="util" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.623481 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c14c7d98-9c2b-456a-9dc8-0857592681bb" containerName="extract" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.623920 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.626341 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7jnt4" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.626662 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.626780 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.626892 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.627128 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.641217 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv"] Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.711476 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c9c10bb-4b07-4f5e-af44-0353e53010d1-apiservice-cert\") pod \"metallb-operator-controller-manager-644bd788d-hj4nv\" (UID: \"2c9c10bb-4b07-4f5e-af44-0353e53010d1\") " pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.711551 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bccrm\" (UniqueName: \"kubernetes.io/projected/2c9c10bb-4b07-4f5e-af44-0353e53010d1-kube-api-access-bccrm\") pod 
\"metallb-operator-controller-manager-644bd788d-hj4nv\" (UID: \"2c9c10bb-4b07-4f5e-af44-0353e53010d1\") " pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.711594 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c9c10bb-4b07-4f5e-af44-0353e53010d1-webhook-cert\") pod \"metallb-operator-controller-manager-644bd788d-hj4nv\" (UID: \"2c9c10bb-4b07-4f5e-af44-0353e53010d1\") " pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.812487 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bccrm\" (UniqueName: \"kubernetes.io/projected/2c9c10bb-4b07-4f5e-af44-0353e53010d1-kube-api-access-bccrm\") pod \"metallb-operator-controller-manager-644bd788d-hj4nv\" (UID: \"2c9c10bb-4b07-4f5e-af44-0353e53010d1\") " pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.812764 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c9c10bb-4b07-4f5e-af44-0353e53010d1-webhook-cert\") pod \"metallb-operator-controller-manager-644bd788d-hj4nv\" (UID: \"2c9c10bb-4b07-4f5e-af44-0353e53010d1\") " pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.812943 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c9c10bb-4b07-4f5e-af44-0353e53010d1-apiservice-cert\") pod \"metallb-operator-controller-manager-644bd788d-hj4nv\" (UID: \"2c9c10bb-4b07-4f5e-af44-0353e53010d1\") " pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" Feb 21 21:57:26 crc kubenswrapper[4717]: 
I0221 21:57:26.818352 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c9c10bb-4b07-4f5e-af44-0353e53010d1-webhook-cert\") pod \"metallb-operator-controller-manager-644bd788d-hj4nv\" (UID: \"2c9c10bb-4b07-4f5e-af44-0353e53010d1\") " pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.818473 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c9c10bb-4b07-4f5e-af44-0353e53010d1-apiservice-cert\") pod \"metallb-operator-controller-manager-644bd788d-hj4nv\" (UID: \"2c9c10bb-4b07-4f5e-af44-0353e53010d1\") " pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.836196 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bccrm\" (UniqueName: \"kubernetes.io/projected/2c9c10bb-4b07-4f5e-af44-0353e53010d1-kube-api-access-bccrm\") pod \"metallb-operator-controller-manager-644bd788d-hj4nv\" (UID: \"2c9c10bb-4b07-4f5e-af44-0353e53010d1\") " pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.898721 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk"] Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.899760 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.903104 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-8z8r7" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.905672 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.905949 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.913699 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95-webhook-cert\") pod \"metallb-operator-webhook-server-7cdb748cc4-h69fk\" (UID: \"d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95\") " pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.913741 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95-apiservice-cert\") pod \"metallb-operator-webhook-server-7cdb748cc4-h69fk\" (UID: \"d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95\") " pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.913761 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vzp4\" (UniqueName: \"kubernetes.io/projected/d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95-kube-api-access-4vzp4\") pod \"metallb-operator-webhook-server-7cdb748cc4-h69fk\" (UID: \"d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95\") " pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" 
Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.917326 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk"] Feb 21 21:57:26 crc kubenswrapper[4717]: I0221 21:57:26.937558 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" Feb 21 21:57:27 crc kubenswrapper[4717]: I0221 21:57:27.015229 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95-webhook-cert\") pod \"metallb-operator-webhook-server-7cdb748cc4-h69fk\" (UID: \"d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95\") " pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" Feb 21 21:57:27 crc kubenswrapper[4717]: I0221 21:57:27.015544 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95-apiservice-cert\") pod \"metallb-operator-webhook-server-7cdb748cc4-h69fk\" (UID: \"d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95\") " pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" Feb 21 21:57:27 crc kubenswrapper[4717]: I0221 21:57:27.015561 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vzp4\" (UniqueName: \"kubernetes.io/projected/d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95-kube-api-access-4vzp4\") pod \"metallb-operator-webhook-server-7cdb748cc4-h69fk\" (UID: \"d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95\") " pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" Feb 21 21:57:27 crc kubenswrapper[4717]: I0221 21:57:27.020754 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95-apiservice-cert\") pod 
\"metallb-operator-webhook-server-7cdb748cc4-h69fk\" (UID: \"d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95\") " pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" Feb 21 21:57:27 crc kubenswrapper[4717]: I0221 21:57:27.020799 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95-webhook-cert\") pod \"metallb-operator-webhook-server-7cdb748cc4-h69fk\" (UID: \"d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95\") " pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" Feb 21 21:57:27 crc kubenswrapper[4717]: I0221 21:57:27.032368 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vzp4\" (UniqueName: \"kubernetes.io/projected/d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95-kube-api-access-4vzp4\") pod \"metallb-operator-webhook-server-7cdb748cc4-h69fk\" (UID: \"d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95\") " pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" Feb 21 21:57:27 crc kubenswrapper[4717]: I0221 21:57:27.222616 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" Feb 21 21:57:27 crc kubenswrapper[4717]: I0221 21:57:27.444082 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv"] Feb 21 21:57:27 crc kubenswrapper[4717]: I0221 21:57:27.526693 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk"] Feb 21 21:57:27 crc kubenswrapper[4717]: W0221 21:57:27.533309 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd39f5f5c_c8f9_4413_a6a7_abd0aa1beb95.slice/crio-5d82620e047a977b34ca6fae871f5df45236cdffa78c736d34921781aa2f5bbe WatchSource:0}: Error finding container 5d82620e047a977b34ca6fae871f5df45236cdffa78c736d34921781aa2f5bbe: Status 404 returned error can't find the container with id 5d82620e047a977b34ca6fae871f5df45236cdffa78c736d34921781aa2f5bbe Feb 21 21:57:28 crc kubenswrapper[4717]: I0221 21:57:28.383071 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" event={"ID":"2c9c10bb-4b07-4f5e-af44-0353e53010d1","Type":"ContainerStarted","Data":"bde1ccb28657db02dd319fc452de3d71ddd056bf5ab71e734ed81fc3d170ce9b"} Feb 21 21:57:28 crc kubenswrapper[4717]: I0221 21:57:28.384630 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" event={"ID":"d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95","Type":"ContainerStarted","Data":"5d82620e047a977b34ca6fae871f5df45236cdffa78c736d34921781aa2f5bbe"} Feb 21 21:57:32 crc kubenswrapper[4717]: I0221 21:57:32.415787 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" 
event={"ID":"2c9c10bb-4b07-4f5e-af44-0353e53010d1","Type":"ContainerStarted","Data":"e313d02a7547eb0422b97babf7b2c76c081484cbb5c62bb0d991b93502462017"} Feb 21 21:57:32 crc kubenswrapper[4717]: I0221 21:57:32.416681 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" Feb 21 21:57:32 crc kubenswrapper[4717]: I0221 21:57:32.419532 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" event={"ID":"d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95","Type":"ContainerStarted","Data":"dc48ea6677bc153275b8c28d3afb975955b7672f0cca5e36f6540e1ae17a07e3"} Feb 21 21:57:32 crc kubenswrapper[4717]: I0221 21:57:32.420163 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" Feb 21 21:57:32 crc kubenswrapper[4717]: I0221 21:57:32.450795 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" podStartSLOduration=2.045773212 podStartE2EDuration="6.450770613s" podCreationTimestamp="2026-02-21 21:57:26 +0000 UTC" firstStartedPulling="2026-02-21 21:57:27.473107925 +0000 UTC m=+662.254641547" lastFinishedPulling="2026-02-21 21:57:31.878105316 +0000 UTC m=+666.659638948" observedRunningTime="2026-02-21 21:57:32.438002945 +0000 UTC m=+667.219536607" watchObservedRunningTime="2026-02-21 21:57:32.450770613 +0000 UTC m=+667.232304275" Feb 21 21:57:32 crc kubenswrapper[4717]: I0221 21:57:32.470920 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" podStartSLOduration=2.112345444 podStartE2EDuration="6.470895387s" podCreationTimestamp="2026-02-21 21:57:26 +0000 UTC" firstStartedPulling="2026-02-21 21:57:27.537918824 +0000 UTC m=+662.319452446" lastFinishedPulling="2026-02-21 
21:57:31.896468757 +0000 UTC m=+666.678002389" observedRunningTime="2026-02-21 21:57:32.466946941 +0000 UTC m=+667.248480603" watchObservedRunningTime="2026-02-21 21:57:32.470895387 +0000 UTC m=+667.252429049" Feb 21 21:57:47 crc kubenswrapper[4717]: I0221 21:57:47.231474 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7cdb748cc4-h69fk" Feb 21 21:58:06 crc kubenswrapper[4717]: I0221 21:58:06.941647 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-644bd788d-hj4nv" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.796643 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl"] Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.797347 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.799042 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.800208 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-4thzk" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.817459 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-n9gwt"] Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.819796 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.823085 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.823304 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.855000 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl"] Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.921275 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-r4xwk"] Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.922404 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-r4xwk" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.923938 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.924068 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-pmcnf" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.924146 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.924735 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.936264 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-gq99r"] Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.937491 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-gq99r" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.939236 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.949956 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-gq99r"] Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.994196 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1063cb22-b437-4152-913e-9673c0a51b7a-metrics-certs\") pod \"speaker-r4xwk\" (UID: \"1063cb22-b437-4152-913e-9673c0a51b7a\") " pod="metallb-system/speaker-r4xwk" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.994311 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-metrics\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.994339 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-reloader\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.994361 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m9jp\" (UniqueName: \"kubernetes.io/projected/75d38a7a-9bcd-49d6-812c-6d451c933f87-kube-api-access-9m9jp\") pod \"controller-69bbfbf88f-gq99r\" (UID: \"75d38a7a-9bcd-49d6-812c-6d451c933f87\") " pod="metallb-system/controller-69bbfbf88f-gq99r" Feb 21 21:58:07 crc kubenswrapper[4717]: 
I0221 21:58:07.994384 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-metrics-certs\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.994416 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl2p9\" (UniqueName: \"kubernetes.io/projected/0f79dc1a-f5c3-4b0b-827f-1aeb3729478e-kube-api-access-gl2p9\") pod \"frr-k8s-webhook-server-78b44bf5bb-bklbl\" (UID: \"0f79dc1a-f5c3-4b0b-827f-1aeb3729478e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.994455 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75d38a7a-9bcd-49d6-812c-6d451c933f87-metrics-certs\") pod \"controller-69bbfbf88f-gq99r\" (UID: \"75d38a7a-9bcd-49d6-812c-6d451c933f87\") " pod="metallb-system/controller-69bbfbf88f-gq99r" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.994471 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-frr-sockets\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.994519 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1063cb22-b437-4152-913e-9673c0a51b7a-memberlist\") pod \"speaker-r4xwk\" (UID: \"1063cb22-b437-4152-913e-9673c0a51b7a\") " pod="metallb-system/speaker-r4xwk" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 
21:58:07.994546 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-frr-conf\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.994560 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f79dc1a-f5c3-4b0b-827f-1aeb3729478e-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-bklbl\" (UID: \"0f79dc1a-f5c3-4b0b-827f-1aeb3729478e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.994579 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v62td\" (UniqueName: \"kubernetes.io/projected/1063cb22-b437-4152-913e-9673c0a51b7a-kube-api-access-v62td\") pod \"speaker-r4xwk\" (UID: \"1063cb22-b437-4152-913e-9673c0a51b7a\") " pod="metallb-system/speaker-r4xwk" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.994619 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-frr-startup\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.994646 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75d38a7a-9bcd-49d6-812c-6d451c933f87-cert\") pod \"controller-69bbfbf88f-gq99r\" (UID: \"75d38a7a-9bcd-49d6-812c-6d451c933f87\") " pod="metallb-system/controller-69bbfbf88f-gq99r" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.994709 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1063cb22-b437-4152-913e-9673c0a51b7a-metallb-excludel2\") pod \"speaker-r4xwk\" (UID: \"1063cb22-b437-4152-913e-9673c0a51b7a\") " pod="metallb-system/speaker-r4xwk" Feb 21 21:58:07 crc kubenswrapper[4717]: I0221 21:58:07.994737 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4kj4\" (UniqueName: \"kubernetes.io/projected/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-kube-api-access-l4kj4\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.095354 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f79dc1a-f5c3-4b0b-827f-1aeb3729478e-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-bklbl\" (UID: \"0f79dc1a-f5c3-4b0b-827f-1aeb3729478e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.095413 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v62td\" (UniqueName: \"kubernetes.io/projected/1063cb22-b437-4152-913e-9673c0a51b7a-kube-api-access-v62td\") pod \"speaker-r4xwk\" (UID: \"1063cb22-b437-4152-913e-9673c0a51b7a\") " pod="metallb-system/speaker-r4xwk" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.095453 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-frr-startup\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.095488 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/75d38a7a-9bcd-49d6-812c-6d451c933f87-cert\") pod \"controller-69bbfbf88f-gq99r\" (UID: \"75d38a7a-9bcd-49d6-812c-6d451c933f87\") " pod="metallb-system/controller-69bbfbf88f-gq99r" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.095532 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1063cb22-b437-4152-913e-9673c0a51b7a-metallb-excludel2\") pod \"speaker-r4xwk\" (UID: \"1063cb22-b437-4152-913e-9673c0a51b7a\") " pod="metallb-system/speaker-r4xwk" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.095568 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4kj4\" (UniqueName: \"kubernetes.io/projected/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-kube-api-access-l4kj4\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.095594 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1063cb22-b437-4152-913e-9673c0a51b7a-metrics-certs\") pod \"speaker-r4xwk\" (UID: \"1063cb22-b437-4152-913e-9673c0a51b7a\") " pod="metallb-system/speaker-r4xwk" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.095640 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-metrics\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.095674 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-reloader\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " 
pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.095704 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m9jp\" (UniqueName: \"kubernetes.io/projected/75d38a7a-9bcd-49d6-812c-6d451c933f87-kube-api-access-9m9jp\") pod \"controller-69bbfbf88f-gq99r\" (UID: \"75d38a7a-9bcd-49d6-812c-6d451c933f87\") " pod="metallb-system/controller-69bbfbf88f-gq99r" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.095735 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-metrics-certs\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.095776 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl2p9\" (UniqueName: \"kubernetes.io/projected/0f79dc1a-f5c3-4b0b-827f-1aeb3729478e-kube-api-access-gl2p9\") pod \"frr-k8s-webhook-server-78b44bf5bb-bklbl\" (UID: \"0f79dc1a-f5c3-4b0b-827f-1aeb3729478e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.095809 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75d38a7a-9bcd-49d6-812c-6d451c933f87-metrics-certs\") pod \"controller-69bbfbf88f-gq99r\" (UID: \"75d38a7a-9bcd-49d6-812c-6d451c933f87\") " pod="metallb-system/controller-69bbfbf88f-gq99r" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.095839 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-frr-sockets\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc 
kubenswrapper[4717]: I0221 21:58:08.095914 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1063cb22-b437-4152-913e-9673c0a51b7a-memberlist\") pod \"speaker-r4xwk\" (UID: \"1063cb22-b437-4152-913e-9673c0a51b7a\") " pod="metallb-system/speaker-r4xwk" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.095953 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-frr-conf\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.096454 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-metrics\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.096582 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-frr-conf\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.096891 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-reloader\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc kubenswrapper[4717]: E0221 21:58:08.096893 4717 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 21 21:58:08 crc kubenswrapper[4717]: E0221 21:58:08.096972 4717 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/1063cb22-b437-4152-913e-9673c0a51b7a-memberlist podName:1063cb22-b437-4152-913e-9673c0a51b7a nodeName:}" failed. No retries permitted until 2026-02-21 21:58:08.596955824 +0000 UTC m=+703.378489446 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1063cb22-b437-4152-913e-9673c0a51b7a-memberlist") pod "speaker-r4xwk" (UID: "1063cb22-b437-4152-913e-9673c0a51b7a") : secret "metallb-memberlist" not found Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.097006 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-frr-sockets\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.097071 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1063cb22-b437-4152-913e-9673c0a51b7a-metallb-excludel2\") pod \"speaker-r4xwk\" (UID: \"1063cb22-b437-4152-913e-9673c0a51b7a\") " pod="metallb-system/speaker-r4xwk" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.098897 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-frr-startup\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.102540 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75d38a7a-9bcd-49d6-812c-6d451c933f87-metrics-certs\") pod \"controller-69bbfbf88f-gq99r\" (UID: \"75d38a7a-9bcd-49d6-812c-6d451c933f87\") " pod="metallb-system/controller-69bbfbf88f-gq99r" Feb 21 21:58:08 crc 
kubenswrapper[4717]: I0221 21:58:08.104010 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75d38a7a-9bcd-49d6-812c-6d451c933f87-cert\") pod \"controller-69bbfbf88f-gq99r\" (UID: \"75d38a7a-9bcd-49d6-812c-6d451c933f87\") " pod="metallb-system/controller-69bbfbf88f-gq99r" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.104331 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f79dc1a-f5c3-4b0b-827f-1aeb3729478e-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-bklbl\" (UID: \"0f79dc1a-f5c3-4b0b-827f-1aeb3729478e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.105418 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1063cb22-b437-4152-913e-9673c0a51b7a-metrics-certs\") pod \"speaker-r4xwk\" (UID: \"1063cb22-b437-4152-913e-9673c0a51b7a\") " pod="metallb-system/speaker-r4xwk" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.115702 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-metrics-certs\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.120646 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v62td\" (UniqueName: \"kubernetes.io/projected/1063cb22-b437-4152-913e-9673c0a51b7a-kube-api-access-v62td\") pod \"speaker-r4xwk\" (UID: \"1063cb22-b437-4152-913e-9673c0a51b7a\") " pod="metallb-system/speaker-r4xwk" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.121341 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl2p9\" (UniqueName: 
\"kubernetes.io/projected/0f79dc1a-f5c3-4b0b-827f-1aeb3729478e-kube-api-access-gl2p9\") pod \"frr-k8s-webhook-server-78b44bf5bb-bklbl\" (UID: \"0f79dc1a-f5c3-4b0b-827f-1aeb3729478e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.124984 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4kj4\" (UniqueName: \"kubernetes.io/projected/c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07-kube-api-access-l4kj4\") pod \"frr-k8s-n9gwt\" (UID: \"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07\") " pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.125983 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m9jp\" (UniqueName: \"kubernetes.io/projected/75d38a7a-9bcd-49d6-812c-6d451c933f87-kube-api-access-9m9jp\") pod \"controller-69bbfbf88f-gq99r\" (UID: \"75d38a7a-9bcd-49d6-812c-6d451c933f87\") " pod="metallb-system/controller-69bbfbf88f-gq99r" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.134517 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.257916 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-gq99r" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.414735 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl" Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.597621 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl"] Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.601378 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1063cb22-b437-4152-913e-9673c0a51b7a-memberlist\") pod \"speaker-r4xwk\" (UID: \"1063cb22-b437-4152-913e-9673c0a51b7a\") " pod="metallb-system/speaker-r4xwk" Feb 21 21:58:08 crc kubenswrapper[4717]: E0221 21:58:08.601560 4717 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 21 21:58:08 crc kubenswrapper[4717]: E0221 21:58:08.601688 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1063cb22-b437-4152-913e-9673c0a51b7a-memberlist podName:1063cb22-b437-4152-913e-9673c0a51b7a nodeName:}" failed. No retries permitted until 2026-02-21 21:58:09.601666737 +0000 UTC m=+704.383200369 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1063cb22-b437-4152-913e-9673c0a51b7a-memberlist") pod "speaker-r4xwk" (UID: "1063cb22-b437-4152-913e-9673c0a51b7a") : secret "metallb-memberlist" not found Feb 21 21:58:08 crc kubenswrapper[4717]: W0221 21:58:08.603767 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f79dc1a_f5c3_4b0b_827f_1aeb3729478e.slice/crio-1e6e167f7a599abe31aba93d8e3d4ff0c13dd3725011517da51298ba5f1b4d22 WatchSource:0}: Error finding container 1e6e167f7a599abe31aba93d8e3d4ff0c13dd3725011517da51298ba5f1b4d22: Status 404 returned error can't find the container with id 1e6e167f7a599abe31aba93d8e3d4ff0c13dd3725011517da51298ba5f1b4d22 Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.666522 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-gq99r"] Feb 21 21:58:08 crc kubenswrapper[4717]: W0221 21:58:08.669087 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75d38a7a_9bcd_49d6_812c_6d451c933f87.slice/crio-a18c4a620d31dd0a68c965b4a929adb32920261df18bbc286480f743473a967d WatchSource:0}: Error finding container a18c4a620d31dd0a68c965b4a929adb32920261df18bbc286480f743473a967d: Status 404 returned error can't find the container with id a18c4a620d31dd0a68c965b4a929adb32920261df18bbc286480f743473a967d Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.702702 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl" event={"ID":"0f79dc1a-f5c3-4b0b-827f-1aeb3729478e","Type":"ContainerStarted","Data":"1e6e167f7a599abe31aba93d8e3d4ff0c13dd3725011517da51298ba5f1b4d22"} Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.703823 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-gq99r" 
event={"ID":"75d38a7a-9bcd-49d6-812c-6d451c933f87","Type":"ContainerStarted","Data":"a18c4a620d31dd0a68c965b4a929adb32920261df18bbc286480f743473a967d"} Feb 21 21:58:08 crc kubenswrapper[4717]: I0221 21:58:08.704770 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9gwt" event={"ID":"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07","Type":"ContainerStarted","Data":"7066d81adf8b6d2f3fa704aa3759522420b86044799a07bed786c8485c06213a"} Feb 21 21:58:09 crc kubenswrapper[4717]: I0221 21:58:09.063071 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 21:58:09 crc kubenswrapper[4717]: I0221 21:58:09.063145 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 21:58:09 crc kubenswrapper[4717]: I0221 21:58:09.618337 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1063cb22-b437-4152-913e-9673c0a51b7a-memberlist\") pod \"speaker-r4xwk\" (UID: \"1063cb22-b437-4152-913e-9673c0a51b7a\") " pod="metallb-system/speaker-r4xwk" Feb 21 21:58:09 crc kubenswrapper[4717]: I0221 21:58:09.625419 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1063cb22-b437-4152-913e-9673c0a51b7a-memberlist\") pod \"speaker-r4xwk\" (UID: \"1063cb22-b437-4152-913e-9673c0a51b7a\") " pod="metallb-system/speaker-r4xwk" Feb 21 21:58:09 crc kubenswrapper[4717]: I0221 21:58:09.712741 4717 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-gq99r" event={"ID":"75d38a7a-9bcd-49d6-812c-6d451c933f87","Type":"ContainerStarted","Data":"d1fa6a98e3befea01735b58cb56cf91cf8d367ec9068df12cb527ddbb7d772ac"} Feb 21 21:58:09 crc kubenswrapper[4717]: I0221 21:58:09.713059 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-gq99r" Feb 21 21:58:09 crc kubenswrapper[4717]: I0221 21:58:09.713156 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-gq99r" event={"ID":"75d38a7a-9bcd-49d6-812c-6d451c933f87","Type":"ContainerStarted","Data":"19bdf7c0f247748c8cfda43acd27830518f9aab79dba2c7779089357700563e6"} Feb 21 21:58:09 crc kubenswrapper[4717]: I0221 21:58:09.736980 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-r4xwk" Feb 21 21:58:09 crc kubenswrapper[4717]: I0221 21:58:09.750420 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-gq99r" podStartSLOduration=2.750402014 podStartE2EDuration="2.750402014s" podCreationTimestamp="2026-02-21 21:58:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:58:09.750090856 +0000 UTC m=+704.531624478" watchObservedRunningTime="2026-02-21 21:58:09.750402014 +0000 UTC m=+704.531935636" Feb 21 21:58:10 crc kubenswrapper[4717]: I0221 21:58:10.721086 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-r4xwk" event={"ID":"1063cb22-b437-4152-913e-9673c0a51b7a","Type":"ContainerStarted","Data":"9df0f187f0b3632ab7371d033bc8906280b06d1440e8280081715901a7e0f122"} Feb 21 21:58:10 crc kubenswrapper[4717]: I0221 21:58:10.721427 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-r4xwk" 
event={"ID":"1063cb22-b437-4152-913e-9673c0a51b7a","Type":"ContainerStarted","Data":"490c719edba9d9e89111bb8488300b23fcd5c9fca16f8f1d9d77fdb1e905db77"} Feb 21 21:58:10 crc kubenswrapper[4717]: I0221 21:58:10.721438 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-r4xwk" event={"ID":"1063cb22-b437-4152-913e-9673c0a51b7a","Type":"ContainerStarted","Data":"f8565c978785dae1179ef49bc17d859476d49314b2b9b63c8a7b6eead9f1a442"} Feb 21 21:58:10 crc kubenswrapper[4717]: I0221 21:58:10.722024 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-r4xwk" Feb 21 21:58:10 crc kubenswrapper[4717]: I0221 21:58:10.744401 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-r4xwk" podStartSLOduration=3.7443793100000002 podStartE2EDuration="3.74437931s" podCreationTimestamp="2026-02-21 21:58:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 21:58:10.739455192 +0000 UTC m=+705.520988824" watchObservedRunningTime="2026-02-21 21:58:10.74437931 +0000 UTC m=+705.525912932" Feb 21 21:58:15 crc kubenswrapper[4717]: I0221 21:58:15.756638 4717 generic.go:334] "Generic (PLEG): container finished" podID="c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07" containerID="219bca1ff34d352bf2d8c2130877b84bfce875ac2017b71dcd02c5e89c423438" exitCode=0 Feb 21 21:58:15 crc kubenswrapper[4717]: I0221 21:58:15.756761 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9gwt" event={"ID":"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07","Type":"ContainerDied","Data":"219bca1ff34d352bf2d8c2130877b84bfce875ac2017b71dcd02c5e89c423438"} Feb 21 21:58:15 crc kubenswrapper[4717]: I0221 21:58:15.759851 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl" 
event={"ID":"0f79dc1a-f5c3-4b0b-827f-1aeb3729478e","Type":"ContainerStarted","Data":"abc16b4aaafe7cad717d89001eb29d53297dd2b96093bca70f84c4621b309a94"} Feb 21 21:58:15 crc kubenswrapper[4717]: I0221 21:58:15.760091 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl" Feb 21 21:58:15 crc kubenswrapper[4717]: I0221 21:58:15.842049 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl" podStartSLOduration=2.108061 podStartE2EDuration="8.842020851s" podCreationTimestamp="2026-02-21 21:58:07 +0000 UTC" firstStartedPulling="2026-02-21 21:58:08.606151345 +0000 UTC m=+703.387684967" lastFinishedPulling="2026-02-21 21:58:15.340111196 +0000 UTC m=+710.121644818" observedRunningTime="2026-02-21 21:58:15.822306868 +0000 UTC m=+710.603840540" watchObservedRunningTime="2026-02-21 21:58:15.842020851 +0000 UTC m=+710.623554543" Feb 21 21:58:16 crc kubenswrapper[4717]: I0221 21:58:16.770362 4717 generic.go:334] "Generic (PLEG): container finished" podID="c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07" containerID="428e3600bbb923c3f5a18257d98e373b413d55e9f2aa3217726abb590c8817e5" exitCode=0 Feb 21 21:58:16 crc kubenswrapper[4717]: I0221 21:58:16.770467 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9gwt" event={"ID":"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07","Type":"ContainerDied","Data":"428e3600bbb923c3f5a18257d98e373b413d55e9f2aa3217726abb590c8817e5"} Feb 21 21:58:17 crc kubenswrapper[4717]: I0221 21:58:17.780346 4717 generic.go:334] "Generic (PLEG): container finished" podID="c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07" containerID="a13c3bff290d6094645c8c5e1a7ed5f1f71a4f8ce5514533bda3115fca7896b4" exitCode=0 Feb 21 21:58:17 crc kubenswrapper[4717]: I0221 21:58:17.780452 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9gwt" 
event={"ID":"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07","Type":"ContainerDied","Data":"a13c3bff290d6094645c8c5e1a7ed5f1f71a4f8ce5514533bda3115fca7896b4"} Feb 21 21:58:18 crc kubenswrapper[4717]: I0221 21:58:18.263770 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-gq99r" Feb 21 21:58:18 crc kubenswrapper[4717]: I0221 21:58:18.797818 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9gwt" event={"ID":"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07","Type":"ContainerStarted","Data":"506e84bf2aed66b87f9facd7703672100c5d8d8b3bb79168ef87a39d7e09cd08"} Feb 21 21:58:18 crc kubenswrapper[4717]: I0221 21:58:18.797923 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9gwt" event={"ID":"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07","Type":"ContainerStarted","Data":"4313adfe12cc9c3dc932b05f56385ea568c3735a0e8ad3c8f7726d9af256487c"} Feb 21 21:58:18 crc kubenswrapper[4717]: I0221 21:58:18.797952 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9gwt" event={"ID":"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07","Type":"ContainerStarted","Data":"f3431c62561e4a91ea4ab1708c580839acb4f70f945f96555c21187ad19b6088"} Feb 21 21:58:18 crc kubenswrapper[4717]: I0221 21:58:18.797976 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9gwt" event={"ID":"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07","Type":"ContainerStarted","Data":"de6d6df67e5a3c78bc2627e6a1d0deb1c3972c343ee881019d165a180fd84ff1"} Feb 21 21:58:19 crc kubenswrapper[4717]: I0221 21:58:19.831352 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9gwt" event={"ID":"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07","Type":"ContainerStarted","Data":"aeda518197f4dac9da405c5ff87ffb2abdc784aa58868f97ee197c96daaefd4e"} Feb 21 21:58:19 crc kubenswrapper[4717]: I0221 21:58:19.831647 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:19 crc kubenswrapper[4717]: I0221 21:58:19.831661 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9gwt" event={"ID":"c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07","Type":"ContainerStarted","Data":"ab714623295016a94795a113be4e25f839a913d85279a8926eb1d579a833b8fe"} Feb 21 21:58:19 crc kubenswrapper[4717]: I0221 21:58:19.906667 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-n9gwt" podStartSLOduration=5.820714237 podStartE2EDuration="12.906633158s" podCreationTimestamp="2026-02-21 21:58:07 +0000 UTC" firstStartedPulling="2026-02-21 21:58:08.270047605 +0000 UTC m=+703.051581227" lastFinishedPulling="2026-02-21 21:58:15.355966526 +0000 UTC m=+710.137500148" observedRunningTime="2026-02-21 21:58:19.872525169 +0000 UTC m=+714.654058831" watchObservedRunningTime="2026-02-21 21:58:19.906633158 +0000 UTC m=+714.688166820" Feb 21 21:58:23 crc kubenswrapper[4717]: I0221 21:58:23.134929 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:23 crc kubenswrapper[4717]: I0221 21:58:23.187463 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:26 crc kubenswrapper[4717]: I0221 21:58:26.464732 4717 scope.go:117] "RemoveContainer" containerID="317f93d3863e5304a45d79f985ffc6d3c24d84ef7dbec3f40bd0b0d15c375815" Feb 21 21:58:26 crc kubenswrapper[4717]: I0221 21:58:26.496763 4717 scope.go:117] "RemoveContainer" containerID="b00b5a03bb3c840369c6d9c97e1d7e226037d758e6cb6b261d1ea79a33f8716d" Feb 21 21:58:28 crc kubenswrapper[4717]: I0221 21:58:28.139981 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-n9gwt" Feb 21 21:58:28 crc kubenswrapper[4717]: I0221 21:58:28.423503 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-bklbl" Feb 21 21:58:29 crc kubenswrapper[4717]: I0221 21:58:29.741524 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-r4xwk" Feb 21 21:58:32 crc kubenswrapper[4717]: I0221 21:58:32.725681 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-9gcb4"] Feb 21 21:58:32 crc kubenswrapper[4717]: I0221 21:58:32.727125 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9gcb4" Feb 21 21:58:32 crc kubenswrapper[4717]: I0221 21:58:32.729300 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-t7f46" Feb 21 21:58:32 crc kubenswrapper[4717]: I0221 21:58:32.729633 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 21 21:58:32 crc kubenswrapper[4717]: I0221 21:58:32.731279 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 21 21:58:32 crc kubenswrapper[4717]: I0221 21:58:32.749111 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9gcb4"] Feb 21 21:58:32 crc kubenswrapper[4717]: I0221 21:58:32.863690 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27sz2\" (UniqueName: \"kubernetes.io/projected/52a5f567-3dbf-4bfa-8577-800d38f1de0d-kube-api-access-27sz2\") pod \"openstack-operator-index-9gcb4\" (UID: \"52a5f567-3dbf-4bfa-8577-800d38f1de0d\") " pod="openstack-operators/openstack-operator-index-9gcb4" Feb 21 21:58:32 crc kubenswrapper[4717]: I0221 21:58:32.965456 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27sz2\" (UniqueName: 
\"kubernetes.io/projected/52a5f567-3dbf-4bfa-8577-800d38f1de0d-kube-api-access-27sz2\") pod \"openstack-operator-index-9gcb4\" (UID: \"52a5f567-3dbf-4bfa-8577-800d38f1de0d\") " pod="openstack-operators/openstack-operator-index-9gcb4" Feb 21 21:58:32 crc kubenswrapper[4717]: I0221 21:58:32.984387 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27sz2\" (UniqueName: \"kubernetes.io/projected/52a5f567-3dbf-4bfa-8577-800d38f1de0d-kube-api-access-27sz2\") pod \"openstack-operator-index-9gcb4\" (UID: \"52a5f567-3dbf-4bfa-8577-800d38f1de0d\") " pod="openstack-operators/openstack-operator-index-9gcb4" Feb 21 21:58:33 crc kubenswrapper[4717]: I0221 21:58:33.047949 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9gcb4" Feb 21 21:58:33 crc kubenswrapper[4717]: I0221 21:58:33.288907 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9gcb4"] Feb 21 21:58:33 crc kubenswrapper[4717]: W0221 21:58:33.297071 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52a5f567_3dbf_4bfa_8577_800d38f1de0d.slice/crio-3ef05d1bd8814429a4bfa30873ed9ece1aa61d13006ecd823e445b88654b1fa0 WatchSource:0}: Error finding container 3ef05d1bd8814429a4bfa30873ed9ece1aa61d13006ecd823e445b88654b1fa0: Status 404 returned error can't find the container with id 3ef05d1bd8814429a4bfa30873ed9ece1aa61d13006ecd823e445b88654b1fa0 Feb 21 21:58:33 crc kubenswrapper[4717]: I0221 21:58:33.945140 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9gcb4" event={"ID":"52a5f567-3dbf-4bfa-8577-800d38f1de0d","Type":"ContainerStarted","Data":"3ef05d1bd8814429a4bfa30873ed9ece1aa61d13006ecd823e445b88654b1fa0"} Feb 21 21:58:35 crc kubenswrapper[4717]: I0221 21:58:35.962628 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-9gcb4" event={"ID":"52a5f567-3dbf-4bfa-8577-800d38f1de0d","Type":"ContainerStarted","Data":"0d248dc0f577b03a907475fb6de24b394524ac6a18cea2482568241a1f5f7e4b"} Feb 21 21:58:35 crc kubenswrapper[4717]: I0221 21:58:35.986523 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-9gcb4" podStartSLOduration=1.813685826 podStartE2EDuration="3.986490761s" podCreationTimestamp="2026-02-21 21:58:32 +0000 UTC" firstStartedPulling="2026-02-21 21:58:33.299676609 +0000 UTC m=+728.081210241" lastFinishedPulling="2026-02-21 21:58:35.472481544 +0000 UTC m=+730.254015176" observedRunningTime="2026-02-21 21:58:35.982530156 +0000 UTC m=+730.764063788" watchObservedRunningTime="2026-02-21 21:58:35.986490761 +0000 UTC m=+730.768024413" Feb 21 21:58:36 crc kubenswrapper[4717]: I0221 21:58:36.100260 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9gcb4"] Feb 21 21:58:36 crc kubenswrapper[4717]: I0221 21:58:36.709319 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hs5wv"] Feb 21 21:58:36 crc kubenswrapper[4717]: I0221 21:58:36.711476 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hs5wv" Feb 21 21:58:36 crc kubenswrapper[4717]: I0221 21:58:36.746407 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hs5wv"] Feb 21 21:58:36 crc kubenswrapper[4717]: I0221 21:58:36.827732 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7f88\" (UniqueName: \"kubernetes.io/projected/0d953b3d-37a2-403a-bba7-369dc024f173-kube-api-access-j7f88\") pod \"openstack-operator-index-hs5wv\" (UID: \"0d953b3d-37a2-403a-bba7-369dc024f173\") " pod="openstack-operators/openstack-operator-index-hs5wv" Feb 21 21:58:36 crc kubenswrapper[4717]: I0221 21:58:36.929503 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7f88\" (UniqueName: \"kubernetes.io/projected/0d953b3d-37a2-403a-bba7-369dc024f173-kube-api-access-j7f88\") pod \"openstack-operator-index-hs5wv\" (UID: \"0d953b3d-37a2-403a-bba7-369dc024f173\") " pod="openstack-operators/openstack-operator-index-hs5wv" Feb 21 21:58:36 crc kubenswrapper[4717]: I0221 21:58:36.966132 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7f88\" (UniqueName: \"kubernetes.io/projected/0d953b3d-37a2-403a-bba7-369dc024f173-kube-api-access-j7f88\") pod \"openstack-operator-index-hs5wv\" (UID: \"0d953b3d-37a2-403a-bba7-369dc024f173\") " pod="openstack-operators/openstack-operator-index-hs5wv" Feb 21 21:58:37 crc kubenswrapper[4717]: I0221 21:58:37.062373 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hs5wv" Feb 21 21:58:37 crc kubenswrapper[4717]: I0221 21:58:37.619937 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hs5wv"] Feb 21 21:58:37 crc kubenswrapper[4717]: W0221 21:58:37.625643 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d953b3d_37a2_403a_bba7_369dc024f173.slice/crio-527aba001f33b22b6ac0ff7e50ccbf72b138cdc8d89d4246b7079aadfbe43441 WatchSource:0}: Error finding container 527aba001f33b22b6ac0ff7e50ccbf72b138cdc8d89d4246b7079aadfbe43441: Status 404 returned error can't find the container with id 527aba001f33b22b6ac0ff7e50ccbf72b138cdc8d89d4246b7079aadfbe43441 Feb 21 21:58:37 crc kubenswrapper[4717]: I0221 21:58:37.983402 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-9gcb4" podUID="52a5f567-3dbf-4bfa-8577-800d38f1de0d" containerName="registry-server" containerID="cri-o://0d248dc0f577b03a907475fb6de24b394524ac6a18cea2482568241a1f5f7e4b" gracePeriod=2 Feb 21 21:58:38 crc kubenswrapper[4717]: I0221 21:58:38.004010 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hs5wv" event={"ID":"0d953b3d-37a2-403a-bba7-369dc024f173","Type":"ContainerStarted","Data":"e542608f8157fd49996af83dd5cd035d3697bacd900f6f8b8eab2c04c4f9779d"} Feb 21 21:58:38 crc kubenswrapper[4717]: I0221 21:58:38.004076 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hs5wv" event={"ID":"0d953b3d-37a2-403a-bba7-369dc024f173","Type":"ContainerStarted","Data":"527aba001f33b22b6ac0ff7e50ccbf72b138cdc8d89d4246b7079aadfbe43441"} Feb 21 21:58:38 crc kubenswrapper[4717]: I0221 21:58:38.023194 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-hs5wv" podStartSLOduration=1.956849728 podStartE2EDuration="2.023162596s" podCreationTimestamp="2026-02-21 21:58:36 +0000 UTC" firstStartedPulling="2026-02-21 21:58:37.631621391 +0000 UTC m=+732.413155043" lastFinishedPulling="2026-02-21 21:58:37.697934249 +0000 UTC m=+732.479467911" observedRunningTime="2026-02-21 21:58:38.015265287 +0000 UTC m=+732.796798989" watchObservedRunningTime="2026-02-21 21:58:38.023162596 +0000 UTC m=+732.804696248" Feb 21 21:58:38 crc kubenswrapper[4717]: I0221 21:58:38.483202 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9gcb4" Feb 21 21:58:38 crc kubenswrapper[4717]: I0221 21:58:38.553959 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27sz2\" (UniqueName: \"kubernetes.io/projected/52a5f567-3dbf-4bfa-8577-800d38f1de0d-kube-api-access-27sz2\") pod \"52a5f567-3dbf-4bfa-8577-800d38f1de0d\" (UID: \"52a5f567-3dbf-4bfa-8577-800d38f1de0d\") " Feb 21 21:58:38 crc kubenswrapper[4717]: I0221 21:58:38.559054 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52a5f567-3dbf-4bfa-8577-800d38f1de0d-kube-api-access-27sz2" (OuterVolumeSpecName: "kube-api-access-27sz2") pod "52a5f567-3dbf-4bfa-8577-800d38f1de0d" (UID: "52a5f567-3dbf-4bfa-8577-800d38f1de0d"). InnerVolumeSpecName "kube-api-access-27sz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:58:38 crc kubenswrapper[4717]: I0221 21:58:38.656207 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27sz2\" (UniqueName: \"kubernetes.io/projected/52a5f567-3dbf-4bfa-8577-800d38f1de0d-kube-api-access-27sz2\") on node \"crc\" DevicePath \"\"" Feb 21 21:58:38 crc kubenswrapper[4717]: I0221 21:58:38.992590 4717 generic.go:334] "Generic (PLEG): container finished" podID="52a5f567-3dbf-4bfa-8577-800d38f1de0d" containerID="0d248dc0f577b03a907475fb6de24b394524ac6a18cea2482568241a1f5f7e4b" exitCode=0 Feb 21 21:58:38 crc kubenswrapper[4717]: I0221 21:58:38.992714 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9gcb4" Feb 21 21:58:38 crc kubenswrapper[4717]: I0221 21:58:38.992751 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9gcb4" event={"ID":"52a5f567-3dbf-4bfa-8577-800d38f1de0d","Type":"ContainerDied","Data":"0d248dc0f577b03a907475fb6de24b394524ac6a18cea2482568241a1f5f7e4b"} Feb 21 21:58:38 crc kubenswrapper[4717]: I0221 21:58:38.993419 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9gcb4" event={"ID":"52a5f567-3dbf-4bfa-8577-800d38f1de0d","Type":"ContainerDied","Data":"3ef05d1bd8814429a4bfa30873ed9ece1aa61d13006ecd823e445b88654b1fa0"} Feb 21 21:58:38 crc kubenswrapper[4717]: I0221 21:58:38.993473 4717 scope.go:117] "RemoveContainer" containerID="0d248dc0f577b03a907475fb6de24b394524ac6a18cea2482568241a1f5f7e4b" Feb 21 21:58:39 crc kubenswrapper[4717]: I0221 21:58:39.023777 4717 scope.go:117] "RemoveContainer" containerID="0d248dc0f577b03a907475fb6de24b394524ac6a18cea2482568241a1f5f7e4b" Feb 21 21:58:39 crc kubenswrapper[4717]: E0221 21:58:39.025264 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0d248dc0f577b03a907475fb6de24b394524ac6a18cea2482568241a1f5f7e4b\": container with ID starting with 0d248dc0f577b03a907475fb6de24b394524ac6a18cea2482568241a1f5f7e4b not found: ID does not exist" containerID="0d248dc0f577b03a907475fb6de24b394524ac6a18cea2482568241a1f5f7e4b" Feb 21 21:58:39 crc kubenswrapper[4717]: I0221 21:58:39.025300 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d248dc0f577b03a907475fb6de24b394524ac6a18cea2482568241a1f5f7e4b"} err="failed to get container status \"0d248dc0f577b03a907475fb6de24b394524ac6a18cea2482568241a1f5f7e4b\": rpc error: code = NotFound desc = could not find container \"0d248dc0f577b03a907475fb6de24b394524ac6a18cea2482568241a1f5f7e4b\": container with ID starting with 0d248dc0f577b03a907475fb6de24b394524ac6a18cea2482568241a1f5f7e4b not found: ID does not exist" Feb 21 21:58:39 crc kubenswrapper[4717]: I0221 21:58:39.044793 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9gcb4"] Feb 21 21:58:39 crc kubenswrapper[4717]: I0221 21:58:39.052967 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-9gcb4"] Feb 21 21:58:39 crc kubenswrapper[4717]: I0221 21:58:39.063416 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 21:58:39 crc kubenswrapper[4717]: I0221 21:58:39.063506 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 21:58:39 crc 
kubenswrapper[4717]: I0221 21:58:39.989949 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52a5f567-3dbf-4bfa-8577-800d38f1de0d" path="/var/lib/kubelet/pods/52a5f567-3dbf-4bfa-8577-800d38f1de0d/volumes" Feb 21 21:58:47 crc kubenswrapper[4717]: I0221 21:58:47.062507 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-hs5wv" Feb 21 21:58:47 crc kubenswrapper[4717]: I0221 21:58:47.063357 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-hs5wv" Feb 21 21:58:47 crc kubenswrapper[4717]: I0221 21:58:47.104045 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-hs5wv" Feb 21 21:58:48 crc kubenswrapper[4717]: I0221 21:58:48.106415 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-hs5wv" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.106153 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8"] Feb 21 21:58:55 crc kubenswrapper[4717]: E0221 21:58:55.108797 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52a5f567-3dbf-4bfa-8577-800d38f1de0d" containerName="registry-server" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.108922 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="52a5f567-3dbf-4bfa-8577-800d38f1de0d" containerName="registry-server" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.109135 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="52a5f567-3dbf-4bfa-8577-800d38f1de0d" containerName="registry-server" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.110313 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.121475 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8"] Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.122965 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-qjbhs" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.246770 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04ed380b-d424-43c9-b15c-384a60a084a0-bundle\") pod \"6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8\" (UID: \"04ed380b-d424-43c9-b15c-384a60a084a0\") " pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.246924 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04ed380b-d424-43c9-b15c-384a60a084a0-util\") pod \"6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8\" (UID: \"04ed380b-d424-43c9-b15c-384a60a084a0\") " pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.246985 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6v6z\" (UniqueName: \"kubernetes.io/projected/04ed380b-d424-43c9-b15c-384a60a084a0-kube-api-access-q6v6z\") pod \"6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8\" (UID: \"04ed380b-d424-43c9-b15c-384a60a084a0\") " pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 
21:58:55.348707 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04ed380b-d424-43c9-b15c-384a60a084a0-util\") pod \"6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8\" (UID: \"04ed380b-d424-43c9-b15c-384a60a084a0\") " pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.348806 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6v6z\" (UniqueName: \"kubernetes.io/projected/04ed380b-d424-43c9-b15c-384a60a084a0-kube-api-access-q6v6z\") pod \"6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8\" (UID: \"04ed380b-d424-43c9-b15c-384a60a084a0\") " pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.348936 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04ed380b-d424-43c9-b15c-384a60a084a0-bundle\") pod \"6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8\" (UID: \"04ed380b-d424-43c9-b15c-384a60a084a0\") " pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.349704 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04ed380b-d424-43c9-b15c-384a60a084a0-bundle\") pod \"6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8\" (UID: \"04ed380b-d424-43c9-b15c-384a60a084a0\") " pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.349809 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/04ed380b-d424-43c9-b15c-384a60a084a0-util\") pod \"6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8\" (UID: \"04ed380b-d424-43c9-b15c-384a60a084a0\") " pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.381702 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6v6z\" (UniqueName: \"kubernetes.io/projected/04ed380b-d424-43c9-b15c-384a60a084a0-kube-api-access-q6v6z\") pod \"6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8\" (UID: \"04ed380b-d424-43c9-b15c-384a60a084a0\") " pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.436132 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" Feb 21 21:58:55 crc kubenswrapper[4717]: I0221 21:58:55.754296 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8"] Feb 21 21:58:56 crc kubenswrapper[4717]: I0221 21:58:56.130242 4717 generic.go:334] "Generic (PLEG): container finished" podID="04ed380b-d424-43c9-b15c-384a60a084a0" containerID="8bb3fbb2cae55216ef730c95583a82047cc409ba7786d3bc69f2127938639cd9" exitCode=0 Feb 21 21:58:56 crc kubenswrapper[4717]: I0221 21:58:56.130303 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" event={"ID":"04ed380b-d424-43c9-b15c-384a60a084a0","Type":"ContainerDied","Data":"8bb3fbb2cae55216ef730c95583a82047cc409ba7786d3bc69f2127938639cd9"} Feb 21 21:58:56 crc kubenswrapper[4717]: I0221 21:58:56.130360 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" event={"ID":"04ed380b-d424-43c9-b15c-384a60a084a0","Type":"ContainerStarted","Data":"7df0523f300f89e2c8dd8183ce3b3bbb4cad95ada7e38693d36ca8eb55aee6e8"} Feb 21 21:58:57 crc kubenswrapper[4717]: E0221 21:58:57.772382 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04ed380b_d424_43c9_b15c_384a60a084a0.slice/crio-af9cee16f9ccb123f28478479fcb23d26dcefd201e249ba1b2b5855ad5d1ef7a.scope\": RecentStats: unable to find data in memory cache]" Feb 21 21:58:58 crc kubenswrapper[4717]: I0221 21:58:58.645468 4717 generic.go:334] "Generic (PLEG): container finished" podID="04ed380b-d424-43c9-b15c-384a60a084a0" containerID="af9cee16f9ccb123f28478479fcb23d26dcefd201e249ba1b2b5855ad5d1ef7a" exitCode=0 Feb 21 21:58:58 crc kubenswrapper[4717]: I0221 21:58:58.645607 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" event={"ID":"04ed380b-d424-43c9-b15c-384a60a084a0","Type":"ContainerDied","Data":"af9cee16f9ccb123f28478479fcb23d26dcefd201e249ba1b2b5855ad5d1ef7a"} Feb 21 21:58:58 crc kubenswrapper[4717]: I0221 21:58:58.723338 4717 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 21 21:58:59 crc kubenswrapper[4717]: I0221 21:58:59.658523 4717 generic.go:334] "Generic (PLEG): container finished" podID="04ed380b-d424-43c9-b15c-384a60a084a0" containerID="afcaf6c0bb6bf72524861a4effab333a6df0e1bc8d84681d1c96d8a24450dcd3" exitCode=0 Feb 21 21:58:59 crc kubenswrapper[4717]: I0221 21:58:59.658605 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" 
event={"ID":"04ed380b-d424-43c9-b15c-384a60a084a0","Type":"ContainerDied","Data":"afcaf6c0bb6bf72524861a4effab333a6df0e1bc8d84681d1c96d8a24450dcd3"} Feb 21 21:59:01 crc kubenswrapper[4717]: I0221 21:59:01.005749 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" Feb 21 21:59:01 crc kubenswrapper[4717]: I0221 21:59:01.161155 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04ed380b-d424-43c9-b15c-384a60a084a0-util\") pod \"04ed380b-d424-43c9-b15c-384a60a084a0\" (UID: \"04ed380b-d424-43c9-b15c-384a60a084a0\") " Feb 21 21:59:01 crc kubenswrapper[4717]: I0221 21:59:01.161739 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04ed380b-d424-43c9-b15c-384a60a084a0-bundle\") pod \"04ed380b-d424-43c9-b15c-384a60a084a0\" (UID: \"04ed380b-d424-43c9-b15c-384a60a084a0\") " Feb 21 21:59:01 crc kubenswrapper[4717]: I0221 21:59:01.161813 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6v6z\" (UniqueName: \"kubernetes.io/projected/04ed380b-d424-43c9-b15c-384a60a084a0-kube-api-access-q6v6z\") pod \"04ed380b-d424-43c9-b15c-384a60a084a0\" (UID: \"04ed380b-d424-43c9-b15c-384a60a084a0\") " Feb 21 21:59:01 crc kubenswrapper[4717]: I0221 21:59:01.172400 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04ed380b-d424-43c9-b15c-384a60a084a0-bundle" (OuterVolumeSpecName: "bundle") pod "04ed380b-d424-43c9-b15c-384a60a084a0" (UID: "04ed380b-d424-43c9-b15c-384a60a084a0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:59:01 crc kubenswrapper[4717]: I0221 21:59:01.174646 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ed380b-d424-43c9-b15c-384a60a084a0-kube-api-access-q6v6z" (OuterVolumeSpecName: "kube-api-access-q6v6z") pod "04ed380b-d424-43c9-b15c-384a60a084a0" (UID: "04ed380b-d424-43c9-b15c-384a60a084a0"). InnerVolumeSpecName "kube-api-access-q6v6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 21:59:01 crc kubenswrapper[4717]: I0221 21:59:01.263910 4717 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04ed380b-d424-43c9-b15c-384a60a084a0-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 21:59:01 crc kubenswrapper[4717]: I0221 21:59:01.263968 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6v6z\" (UniqueName: \"kubernetes.io/projected/04ed380b-d424-43c9-b15c-384a60a084a0-kube-api-access-q6v6z\") on node \"crc\" DevicePath \"\"" Feb 21 21:59:01 crc kubenswrapper[4717]: I0221 21:59:01.464917 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04ed380b-d424-43c9-b15c-384a60a084a0-util" (OuterVolumeSpecName: "util") pod "04ed380b-d424-43c9-b15c-384a60a084a0" (UID: "04ed380b-d424-43c9-b15c-384a60a084a0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 21:59:01 crc kubenswrapper[4717]: I0221 21:59:01.466964 4717 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04ed380b-d424-43c9-b15c-384a60a084a0-util\") on node \"crc\" DevicePath \"\"" Feb 21 21:59:01 crc kubenswrapper[4717]: I0221 21:59:01.682584 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" event={"ID":"04ed380b-d424-43c9-b15c-384a60a084a0","Type":"ContainerDied","Data":"7df0523f300f89e2c8dd8183ce3b3bbb4cad95ada7e38693d36ca8eb55aee6e8"} Feb 21 21:59:01 crc kubenswrapper[4717]: I0221 21:59:01.682642 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7df0523f300f89e2c8dd8183ce3b3bbb4cad95ada7e38693d36ca8eb55aee6e8" Feb 21 21:59:01 crc kubenswrapper[4717]: I0221 21:59:01.682695 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8" Feb 21 21:59:07 crc kubenswrapper[4717]: I0221 21:59:07.809153 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5ccb695f5f-bb64r"] Feb 21 21:59:07 crc kubenswrapper[4717]: E0221 21:59:07.809918 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ed380b-d424-43c9-b15c-384a60a084a0" containerName="pull" Feb 21 21:59:07 crc kubenswrapper[4717]: I0221 21:59:07.809934 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ed380b-d424-43c9-b15c-384a60a084a0" containerName="pull" Feb 21 21:59:07 crc kubenswrapper[4717]: E0221 21:59:07.809951 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ed380b-d424-43c9-b15c-384a60a084a0" containerName="util" Feb 21 21:59:07 crc kubenswrapper[4717]: I0221 21:59:07.809958 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="04ed380b-d424-43c9-b15c-384a60a084a0" containerName="util" Feb 21 21:59:07 crc kubenswrapper[4717]: E0221 21:59:07.809971 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ed380b-d424-43c9-b15c-384a60a084a0" containerName="extract" Feb 21 21:59:07 crc kubenswrapper[4717]: I0221 21:59:07.809979 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ed380b-d424-43c9-b15c-384a60a084a0" containerName="extract" Feb 21 21:59:07 crc kubenswrapper[4717]: I0221 21:59:07.810170 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ed380b-d424-43c9-b15c-384a60a084a0" containerName="extract" Feb 21 21:59:07 crc kubenswrapper[4717]: I0221 21:59:07.810830 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5ccb695f5f-bb64r" Feb 21 21:59:07 crc kubenswrapper[4717]: I0221 21:59:07.814506 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-gfwb9" Feb 21 21:59:07 crc kubenswrapper[4717]: I0221 21:59:07.833452 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5ccb695f5f-bb64r"] Feb 21 21:59:07 crc kubenswrapper[4717]: I0221 21:59:07.965640 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2thxh\" (UniqueName: \"kubernetes.io/projected/29923aad-fe1a-464c-8f19-dc10ef9e4eaa-kube-api-access-2thxh\") pod \"openstack-operator-controller-init-5ccb695f5f-bb64r\" (UID: \"29923aad-fe1a-464c-8f19-dc10ef9e4eaa\") " pod="openstack-operators/openstack-operator-controller-init-5ccb695f5f-bb64r" Feb 21 21:59:08 crc kubenswrapper[4717]: I0221 21:59:08.067881 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2thxh\" (UniqueName: 
\"kubernetes.io/projected/29923aad-fe1a-464c-8f19-dc10ef9e4eaa-kube-api-access-2thxh\") pod \"openstack-operator-controller-init-5ccb695f5f-bb64r\" (UID: \"29923aad-fe1a-464c-8f19-dc10ef9e4eaa\") " pod="openstack-operators/openstack-operator-controller-init-5ccb695f5f-bb64r" Feb 21 21:59:08 crc kubenswrapper[4717]: I0221 21:59:08.098823 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2thxh\" (UniqueName: \"kubernetes.io/projected/29923aad-fe1a-464c-8f19-dc10ef9e4eaa-kube-api-access-2thxh\") pod \"openstack-operator-controller-init-5ccb695f5f-bb64r\" (UID: \"29923aad-fe1a-464c-8f19-dc10ef9e4eaa\") " pod="openstack-operators/openstack-operator-controller-init-5ccb695f5f-bb64r" Feb 21 21:59:08 crc kubenswrapper[4717]: I0221 21:59:08.152353 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5ccb695f5f-bb64r" Feb 21 21:59:08 crc kubenswrapper[4717]: I0221 21:59:08.453552 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5ccb695f5f-bb64r"] Feb 21 21:59:08 crc kubenswrapper[4717]: W0221 21:59:08.464593 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29923aad_fe1a_464c_8f19_dc10ef9e4eaa.slice/crio-48b9fa96860e46f97926d58fb3d21dfe35d286b726107a9f6dc3f672b0b2e859 WatchSource:0}: Error finding container 48b9fa96860e46f97926d58fb3d21dfe35d286b726107a9f6dc3f672b0b2e859: Status 404 returned error can't find the container with id 48b9fa96860e46f97926d58fb3d21dfe35d286b726107a9f6dc3f672b0b2e859 Feb 21 21:59:08 crc kubenswrapper[4717]: I0221 21:59:08.741034 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5ccb695f5f-bb64r" 
event={"ID":"29923aad-fe1a-464c-8f19-dc10ef9e4eaa","Type":"ContainerStarted","Data":"48b9fa96860e46f97926d58fb3d21dfe35d286b726107a9f6dc3f672b0b2e859"} Feb 21 21:59:09 crc kubenswrapper[4717]: I0221 21:59:09.062833 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 21:59:09 crc kubenswrapper[4717]: I0221 21:59:09.062908 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 21:59:09 crc kubenswrapper[4717]: I0221 21:59:09.062957 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 21:59:09 crc kubenswrapper[4717]: I0221 21:59:09.063545 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38f86864c8d1bb2ef635ae7b8573c0d40328b4d39ce0f3640268f93045f23c56"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 21:59:09 crc kubenswrapper[4717]: I0221 21:59:09.063606 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://38f86864c8d1bb2ef635ae7b8573c0d40328b4d39ce0f3640268f93045f23c56" gracePeriod=600 Feb 21 21:59:09 crc kubenswrapper[4717]: I0221 21:59:09.750554 
4717 generic.go:334] "Generic (PLEG): container finished" podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerID="38f86864c8d1bb2ef635ae7b8573c0d40328b4d39ce0f3640268f93045f23c56" exitCode=0 Feb 21 21:59:09 crc kubenswrapper[4717]: I0221 21:59:09.750627 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"38f86864c8d1bb2ef635ae7b8573c0d40328b4d39ce0f3640268f93045f23c56"} Feb 21 21:59:09 crc kubenswrapper[4717]: I0221 21:59:09.751130 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"2d284cad9372c32723c3911aa224f8fd37b88ced35957297bd0664e6eabafd92"} Feb 21 21:59:09 crc kubenswrapper[4717]: I0221 21:59:09.751156 4717 scope.go:117] "RemoveContainer" containerID="590a8ade18d9099df0dbb922a2c22739aef34b874d1adc46c6c79c7dc49ef4a7" Feb 21 21:59:12 crc kubenswrapper[4717]: I0221 21:59:12.783485 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5ccb695f5f-bb64r" event={"ID":"29923aad-fe1a-464c-8f19-dc10ef9e4eaa","Type":"ContainerStarted","Data":"e376c7d60b3dbe6c89e526adb585d437e67362bbae5219c1d3e77db5ae677565"} Feb 21 21:59:12 crc kubenswrapper[4717]: I0221 21:59:12.784196 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5ccb695f5f-bb64r" Feb 21 21:59:12 crc kubenswrapper[4717]: I0221 21:59:12.817214 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5ccb695f5f-bb64r" podStartSLOduration=2.147179515 podStartE2EDuration="5.817186417s" podCreationTimestamp="2026-02-21 21:59:07 +0000 UTC" firstStartedPulling="2026-02-21 21:59:08.467380807 +0000 UTC 
m=+763.248914439" lastFinishedPulling="2026-02-21 21:59:12.137387719 +0000 UTC m=+766.918921341" observedRunningTime="2026-02-21 21:59:12.808001246 +0000 UTC m=+767.589534908" watchObservedRunningTime="2026-02-21 21:59:12.817186417 +0000 UTC m=+767.598720049" Feb 21 21:59:18 crc kubenswrapper[4717]: I0221 21:59:18.156154 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5ccb695f5f-bb64r" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.434775 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-877lb"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.436112 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-877lb" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.437802 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-6h47m" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.442512 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-877lb"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.455581 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-klhlb"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.456739 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-klhlb" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.460791 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-77ml7" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.492674 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-lgrct"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.494942 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-lgrct" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.497624 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-jqqbm" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.503189 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-klhlb"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.548762 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-blpgd"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.549659 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-blpgd" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.553823 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-vqsnw" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.555784 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-lgrct"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.561599 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-mwvrn"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.562614 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-mwvrn" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.568431 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-txh8x" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.571937 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6j92\" (UniqueName: \"kubernetes.io/projected/c5916af5-fc6c-4473-aafd-5331043ac1d8-kube-api-access-g6j92\") pod \"cinder-operator-controller-manager-55d77d7b5c-klhlb\" (UID: \"c5916af5-fc6c-4473-aafd-5331043ac1d8\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-klhlb" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.571981 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hth74\" (UniqueName: \"kubernetes.io/projected/14e06d52-8282-4fcd-9cec-6c29a6336057-kube-api-access-hth74\") pod \"barbican-operator-controller-manager-868647ff47-877lb\" (UID: \"14e06d52-8282-4fcd-9cec-6c29a6336057\") " 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-877lb" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.572044 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-blpgd"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.583265 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-mwvrn"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.588106 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qqrnp"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.589178 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qqrnp" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.591249 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.591788 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.595236 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qqrnp"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.596563 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-4w9bc" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.596810 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.597002 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-jmbcr" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.611946 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-fkch5"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.612722 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-fkch5" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.616036 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.623311 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-btmcp" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.629525 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-fkch5"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.634667 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-66ssv"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.635471 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-66ssv" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.641480 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-9bfzs" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.649388 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-mflb2"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.650114 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mflb2" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.655769 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-68hhw" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.660469 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.661818 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.667243 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-cltk4" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.673535 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bf6p\" (UniqueName: \"kubernetes.io/projected/8cb19ba1-4432-41e7-afee-6fccd02f8564-kube-api-access-7bf6p\") pod \"glance-operator-controller-manager-784b5bb6c5-blpgd\" (UID: \"8cb19ba1-4432-41e7-afee-6fccd02f8564\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-blpgd" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.673614 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6j92\" (UniqueName: \"kubernetes.io/projected/c5916af5-fc6c-4473-aafd-5331043ac1d8-kube-api-access-g6j92\") pod \"cinder-operator-controller-manager-55d77d7b5c-klhlb\" (UID: \"c5916af5-fc6c-4473-aafd-5331043ac1d8\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-klhlb" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.673659 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-hth74\" (UniqueName: \"kubernetes.io/projected/14e06d52-8282-4fcd-9cec-6c29a6336057-kube-api-access-hth74\") pod \"barbican-operator-controller-manager-868647ff47-877lb\" (UID: \"14e06d52-8282-4fcd-9cec-6c29a6336057\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-877lb" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.673711 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x844v\" (UniqueName: \"kubernetes.io/projected/4caf3d32-5fe1-4711-a79f-7ff3b2bee3a6-kube-api-access-x844v\") pod \"designate-operator-controller-manager-6d8bf5c495-lgrct\" (UID: \"4caf3d32-5fe1-4711-a79f-7ff3b2bee3a6\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-lgrct" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.673753 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psltq\" (UniqueName: \"kubernetes.io/projected/1c89580c-c289-4ca5-b394-a85fa285dc30-kube-api-access-psltq\") pod \"heat-operator-controller-manager-69f49c598c-mwvrn\" (UID: \"1c89580c-c289-4ca5-b394-a85fa285dc30\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-mwvrn" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.675694 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-66ssv"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.685440 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.691908 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-mflb2"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.712095 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-7glks"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.712797 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7glks" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.722535 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8mw2x"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.723571 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.724383 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.724447 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8mw2x" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.725793 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-c792q" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.725894 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6j92\" (UniqueName: \"kubernetes.io/projected/c5916af5-fc6c-4473-aafd-5331043ac1d8-kube-api-access-g6j92\") pod \"cinder-operator-controller-manager-55d77d7b5c-klhlb\" (UID: \"c5916af5-fc6c-4473-aafd-5331043ac1d8\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-klhlb" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.725907 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-tr8d8" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.726470 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hth74\" (UniqueName: \"kubernetes.io/projected/14e06d52-8282-4fcd-9cec-6c29a6336057-kube-api-access-hth74\") pod \"barbican-operator-controller-manager-868647ff47-877lb\" (UID: \"14e06d52-8282-4fcd-9cec-6c29a6336057\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-877lb" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.726533 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-4wfjp" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.750264 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-7glks"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.756134 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.759900 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8mw2x"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.768371 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.769239 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.770365 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-877lb" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.773260 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-gbfww" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.774844 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfwkv\" (UniqueName: \"kubernetes.io/projected/adb80d2f-050a-47f9-afe2-46cd5876e640-kube-api-access-dfwkv\") pod \"keystone-operator-controller-manager-b4d948c87-mflb2\" (UID: \"adb80d2f-050a-47f9-afe2-46cd5876e640\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mflb2" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.774912 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97nbt\" (UniqueName: \"kubernetes.io/projected/8c916c65-714b-4f8d-b551-c35239deab87-kube-api-access-97nbt\") pod \"manila-operator-controller-manager-67d996989d-66ssv\" (UID: \"8c916c65-714b-4f8d-b551-c35239deab87\") " 
pod="openstack-operators/manila-operator-controller-manager-67d996989d-66ssv" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.774969 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert\") pod \"infra-operator-controller-manager-79d975b745-pb7k5\" (UID: \"ff6fe6a4-86ca-4723-915d-b69be63387b6\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.775001 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x844v\" (UniqueName: \"kubernetes.io/projected/4caf3d32-5fe1-4711-a79f-7ff3b2bee3a6-kube-api-access-x844v\") pod \"designate-operator-controller-manager-6d8bf5c495-lgrct\" (UID: \"4caf3d32-5fe1-4711-a79f-7ff3b2bee3a6\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-lgrct" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.775052 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psltq\" (UniqueName: \"kubernetes.io/projected/1c89580c-c289-4ca5-b394-a85fa285dc30-kube-api-access-psltq\") pod \"heat-operator-controller-manager-69f49c598c-mwvrn\" (UID: \"1c89580c-c289-4ca5-b394-a85fa285dc30\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-mwvrn" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.775089 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pldz\" (UniqueName: \"kubernetes.io/projected/85ba4b92-6749-498a-b112-db89d6856988-kube-api-access-9pldz\") pod \"ironic-operator-controller-manager-554564d7fc-fkch5\" (UID: \"85ba4b92-6749-498a-b112-db89d6856988\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-fkch5" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.775143 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j86j7\" (UniqueName: \"kubernetes.io/projected/5b8a35aa-e7ad-4103-b3db-1011411811db-kube-api-access-j86j7\") pod \"horizon-operator-controller-manager-5b9b8895d5-qqrnp\" (UID: \"5b8a35aa-e7ad-4103-b3db-1011411811db\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qqrnp" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.775178 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsr45\" (UniqueName: \"kubernetes.io/projected/ff6fe6a4-86ca-4723-915d-b69be63387b6-kube-api-access-wsr45\") pod \"infra-operator-controller-manager-79d975b745-pb7k5\" (UID: \"ff6fe6a4-86ca-4723-915d-b69be63387b6\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.775227 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bf6p\" (UniqueName: \"kubernetes.io/projected/8cb19ba1-4432-41e7-afee-6fccd02f8564-kube-api-access-7bf6p\") pod \"glance-operator-controller-manager-784b5bb6c5-blpgd\" (UID: \"8cb19ba1-4432-41e7-afee-6fccd02f8564\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-blpgd" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.775288 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjf2h\" (UniqueName: \"kubernetes.io/projected/825c4fa5-a334-48b9-9ae0-583beb7e6a6b-kube-api-access-bjf2h\") pod \"mariadb-operator-controller-manager-6994f66f48-96s5h\" (UID: \"825c4fa5-a334-48b9-9ae0-583beb7e6a6b\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.778911 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.779699 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.782821 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-xs5xq"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.783564 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.784266 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-rhwtq" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.788128 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.788334 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-xs5xq" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.790416 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-qx579" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.806926 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.808741 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-klhlb" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.814590 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-xs5xq"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.815392 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psltq\" (UniqueName: \"kubernetes.io/projected/1c89580c-c289-4ca5-b394-a85fa285dc30-kube-api-access-psltq\") pod \"heat-operator-controller-manager-69f49c598c-mwvrn\" (UID: \"1c89580c-c289-4ca5-b394-a85fa285dc30\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-mwvrn" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.815464 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x844v\" (UniqueName: \"kubernetes.io/projected/4caf3d32-5fe1-4711-a79f-7ff3b2bee3a6-kube-api-access-x844v\") pod \"designate-operator-controller-manager-6d8bf5c495-lgrct\" (UID: \"4caf3d32-5fe1-4711-a79f-7ff3b2bee3a6\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-lgrct" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.829535 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bf6p\" (UniqueName: \"kubernetes.io/projected/8cb19ba1-4432-41e7-afee-6fccd02f8564-kube-api-access-7bf6p\") pod \"glance-operator-controller-manager-784b5bb6c5-blpgd\" (UID: \"8cb19ba1-4432-41e7-afee-6fccd02f8564\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-blpgd" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.844731 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.849427 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-lgrct" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.849973 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.859768 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-hhqw8" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.868585 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-blpgd" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.876442 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxfbg\" (UniqueName: \"kubernetes.io/projected/4b328324-d3f1-4de9-b5b0-fb28bd7dfedd-kube-api-access-mxfbg\") pod \"ovn-operator-controller-manager-5955d8c787-gldvl\" (UID: \"4b328324-d3f1-4de9-b5b0-fb28bd7dfedd\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.876561 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjf2h\" (UniqueName: \"kubernetes.io/projected/825c4fa5-a334-48b9-9ae0-583beb7e6a6b-kube-api-access-bjf2h\") pod \"mariadb-operator-controller-manager-6994f66f48-96s5h\" (UID: \"825c4fa5-a334-48b9-9ae0-583beb7e6a6b\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.876659 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfwkv\" (UniqueName: \"kubernetes.io/projected/adb80d2f-050a-47f9-afe2-46cd5876e640-kube-api-access-dfwkv\") pod 
\"keystone-operator-controller-manager-b4d948c87-mflb2\" (UID: \"adb80d2f-050a-47f9-afe2-46cd5876e640\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mflb2" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.876761 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97nbt\" (UniqueName: \"kubernetes.io/projected/8c916c65-714b-4f8d-b551-c35239deab87-kube-api-access-97nbt\") pod \"manila-operator-controller-manager-67d996989d-66ssv\" (UID: \"8c916c65-714b-4f8d-b551-c35239deab87\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-66ssv" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.876834 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvscq\" (UniqueName: \"kubernetes.io/projected/5baed11e-00fc-4c09-8a82-fb761682244e-kube-api-access-dvscq\") pod \"neutron-operator-controller-manager-6bd4687957-vzfhh\" (UID: \"5baed11e-00fc-4c09-8a82-fb761682244e\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.876975 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert\") pod \"infra-operator-controller-manager-79d975b745-pb7k5\" (UID: \"ff6fe6a4-86ca-4723-915d-b69be63387b6\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.877061 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pldz\" (UniqueName: \"kubernetes.io/projected/85ba4b92-6749-498a-b112-db89d6856988-kube-api-access-9pldz\") pod \"ironic-operator-controller-manager-554564d7fc-fkch5\" (UID: \"85ba4b92-6749-498a-b112-db89d6856988\") " 
pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-fkch5" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.877145 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srhrm\" (UniqueName: \"kubernetes.io/projected/d839bd2c-8b12-4d02-a6b5-0399f3ded9fd-kube-api-access-srhrm\") pod \"nova-operator-controller-manager-567668f5cf-7glks\" (UID: \"d839bd2c-8b12-4d02-a6b5-0399f3ded9fd\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7glks" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.877226 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j86j7\" (UniqueName: \"kubernetes.io/projected/5b8a35aa-e7ad-4103-b3db-1011411811db-kube-api-access-j86j7\") pod \"horizon-operator-controller-manager-5b9b8895d5-qqrnp\" (UID: \"5b8a35aa-e7ad-4103-b3db-1011411811db\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qqrnp" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.877323 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsr45\" (UniqueName: \"kubernetes.io/projected/ff6fe6a4-86ca-4723-915d-b69be63387b6-kube-api-access-wsr45\") pod \"infra-operator-controller-manager-79d975b745-pb7k5\" (UID: \"ff6fe6a4-86ca-4723-915d-b69be63387b6\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.877403 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfth6\" (UniqueName: \"kubernetes.io/projected/a2620c81-a9f0-4d4c-b281-5e5effb23419-kube-api-access-zfth6\") pod \"octavia-operator-controller-manager-659dc6bbfc-8mw2x\" (UID: \"a2620c81-a9f0-4d4c-b281-5e5effb23419\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8mw2x" Feb 21 21:59:53 crc kubenswrapper[4717]: E0221 
21:59:53.877978 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 21 21:59:53 crc kubenswrapper[4717]: E0221 21:59:53.878023 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert podName:ff6fe6a4-86ca-4723-915d-b69be63387b6 nodeName:}" failed. No retries permitted until 2026-02-21 21:59:54.378008404 +0000 UTC m=+809.159542026 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert") pod "infra-operator-controller-manager-79d975b745-pb7k5" (UID: "ff6fe6a4-86ca-4723-915d-b69be63387b6") : secret "infra-operator-webhook-server-cert" not found Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.883954 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.905366 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-mwvrn" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.909851 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97nbt\" (UniqueName: \"kubernetes.io/projected/8c916c65-714b-4f8d-b551-c35239deab87-kube-api-access-97nbt\") pod \"manila-operator-controller-manager-67d996989d-66ssv\" (UID: \"8c916c65-714b-4f8d-b551-c35239deab87\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-66ssv" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.911533 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j86j7\" (UniqueName: \"kubernetes.io/projected/5b8a35aa-e7ad-4103-b3db-1011411811db-kube-api-access-j86j7\") pod \"horizon-operator-controller-manager-5b9b8895d5-qqrnp\" (UID: \"5b8a35aa-e7ad-4103-b3db-1011411811db\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qqrnp" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.917164 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qqrnp" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.922919 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsr45\" (UniqueName: \"kubernetes.io/projected/ff6fe6a4-86ca-4723-915d-b69be63387b6-kube-api-access-wsr45\") pod \"infra-operator-controller-manager-79d975b745-pb7k5\" (UID: \"ff6fe6a4-86ca-4723-915d-b69be63387b6\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.928410 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjf2h\" (UniqueName: \"kubernetes.io/projected/825c4fa5-a334-48b9-9ae0-583beb7e6a6b-kube-api-access-bjf2h\") pod \"mariadb-operator-controller-manager-6994f66f48-96s5h\" (UID: \"825c4fa5-a334-48b9-9ae0-583beb7e6a6b\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.932611 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pldz\" (UniqueName: \"kubernetes.io/projected/85ba4b92-6749-498a-b112-db89d6856988-kube-api-access-9pldz\") pod \"ironic-operator-controller-manager-554564d7fc-fkch5\" (UID: \"85ba4b92-6749-498a-b112-db89d6856988\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-fkch5" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.933467 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfwkv\" (UniqueName: \"kubernetes.io/projected/adb80d2f-050a-47f9-afe2-46cd5876e640-kube-api-access-dfwkv\") pod \"keystone-operator-controller-manager-b4d948c87-mflb2\" (UID: \"adb80d2f-050a-47f9-afe2-46cd5876e640\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mflb2" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.939908 4717 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.940722 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.944114 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-mc4tg" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.948193 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.955110 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-fkch5" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.972609 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms"] Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.973378 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.974429 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-66ssv" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.977286 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-lqjx8" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.985963 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brg59\" (UniqueName: \"kubernetes.io/projected/3204092b-362c-42ed-ab07-3db2d36d32e5-kube-api-access-brg59\") pod \"placement-operator-controller-manager-8497b45c89-xs5xq\" (UID: \"3204092b-362c-42ed-ab07-3db2d36d32e5\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-xs5xq" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.986939 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvscq\" (UniqueName: \"kubernetes.io/projected/5baed11e-00fc-4c09-8a82-fb761682244e-kube-api-access-dvscq\") pod \"neutron-operator-controller-manager-6bd4687957-vzfhh\" (UID: \"5baed11e-00fc-4c09-8a82-fb761682244e\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.986983 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srhrm\" (UniqueName: \"kubernetes.io/projected/d839bd2c-8b12-4d02-a6b5-0399f3ded9fd-kube-api-access-srhrm\") pod \"nova-operator-controller-manager-567668f5cf-7glks\" (UID: \"d839bd2c-8b12-4d02-a6b5-0399f3ded9fd\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7glks" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.987004 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxngj\" (UniqueName: \"kubernetes.io/projected/97c089e6-a2dd-4d56-8a8f-9c8d4d8f6f8e-kube-api-access-bxngj\") pod 
\"telemetry-operator-controller-manager-589c568786-6vmk8\" (UID: \"97c089e6-a2dd-4d56-8a8f-9c8d4d8f6f8e\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.987053 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqkk7\" (UniqueName: \"kubernetes.io/projected/6d86a5a0-240a-4b65-af2b-6a5d91d95744-kube-api-access-hqkk7\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg\" (UID: \"6d86a5a0-240a-4b65-af2b-6a5d91d95744\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.987150 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg\" (UID: \"6d86a5a0-240a-4b65-af2b-6a5d91d95744\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.987235 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfth6\" (UniqueName: \"kubernetes.io/projected/a2620c81-a9f0-4d4c-b281-5e5effb23419-kube-api-access-zfth6\") pod \"octavia-operator-controller-manager-659dc6bbfc-8mw2x\" (UID: \"a2620c81-a9f0-4d4c-b281-5e5effb23419\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8mw2x" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.987278 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxfbg\" (UniqueName: \"kubernetes.io/projected/4b328324-d3f1-4de9-b5b0-fb28bd7dfedd-kube-api-access-mxfbg\") pod \"ovn-operator-controller-manager-5955d8c787-gldvl\" (UID: \"4b328324-d3f1-4de9-b5b0-fb28bd7dfedd\") " 
pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.987297 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt4m6\" (UniqueName: \"kubernetes.io/projected/7a130695-4494-482a-b4fb-4703071fd28f-kube-api-access-kt4m6\") pod \"swift-operator-controller-manager-68f46476f-b5bzj\" (UID: \"7a130695-4494-482a-b4fb-4703071fd28f\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj" Feb 21 21:59:53 crc kubenswrapper[4717]: I0221 21:59:53.987320 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgwvr\" (UniqueName: \"kubernetes.io/projected/a8cafe00-f55d-4444-ae31-827ac956b47c-kube-api-access-xgwvr\") pod \"test-operator-controller-manager-5dc6794d5b-j6mms\" (UID: \"a8cafe00-f55d-4444-ae31-827ac956b47c\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:53.998542 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mflb2" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.026990 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srhrm\" (UniqueName: \"kubernetes.io/projected/d839bd2c-8b12-4d02-a6b5-0399f3ded9fd-kube-api-access-srhrm\") pod \"nova-operator-controller-manager-567668f5cf-7glks\" (UID: \"d839bd2c-8b12-4d02-a6b5-0399f3ded9fd\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7glks" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.026993 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvscq\" (UniqueName: \"kubernetes.io/projected/5baed11e-00fc-4c09-8a82-fb761682244e-kube-api-access-dvscq\") pod \"neutron-operator-controller-manager-6bd4687957-vzfhh\" (UID: \"5baed11e-00fc-4c09-8a82-fb761682244e\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.033028 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms"] Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.042380 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.042798 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxfbg\" (UniqueName: \"kubernetes.io/projected/4b328324-d3f1-4de9-b5b0-fb28bd7dfedd-kube-api-access-mxfbg\") pod \"ovn-operator-controller-manager-5955d8c787-gldvl\" (UID: \"4b328324-d3f1-4de9-b5b0-fb28bd7dfedd\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.043202 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfth6\" (UniqueName: \"kubernetes.io/projected/a2620c81-a9f0-4d4c-b281-5e5effb23419-kube-api-access-zfth6\") pod \"octavia-operator-controller-manager-659dc6bbfc-8mw2x\" (UID: \"a2620c81-a9f0-4d4c-b281-5e5effb23419\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8mw2x" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.063347 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7glks" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.066363 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8mw2x" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.080128 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.087011 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.088936 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxngj\" (UniqueName: \"kubernetes.io/projected/97c089e6-a2dd-4d56-8a8f-9c8d4d8f6f8e-kube-api-access-bxngj\") pod \"telemetry-operator-controller-manager-589c568786-6vmk8\" (UID: \"97c089e6-a2dd-4d56-8a8f-9c8d4d8f6f8e\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.088971 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqkk7\" (UniqueName: \"kubernetes.io/projected/6d86a5a0-240a-4b65-af2b-6a5d91d95744-kube-api-access-hqkk7\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg\" (UID: \"6d86a5a0-240a-4b65-af2b-6a5d91d95744\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.089001 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg\" (UID: \"6d86a5a0-240a-4b65-af2b-6a5d91d95744\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.089050 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt4m6\" (UniqueName: \"kubernetes.io/projected/7a130695-4494-482a-b4fb-4703071fd28f-kube-api-access-kt4m6\") pod \"swift-operator-controller-manager-68f46476f-b5bzj\" (UID: \"7a130695-4494-482a-b4fb-4703071fd28f\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 
21:59:54.089073 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgwvr\" (UniqueName: \"kubernetes.io/projected/a8cafe00-f55d-4444-ae31-827ac956b47c-kube-api-access-xgwvr\") pod \"test-operator-controller-manager-5dc6794d5b-j6mms\" (UID: \"a8cafe00-f55d-4444-ae31-827ac956b47c\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.089096 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brg59\" (UniqueName: \"kubernetes.io/projected/3204092b-362c-42ed-ab07-3db2d36d32e5-kube-api-access-brg59\") pod \"placement-operator-controller-manager-8497b45c89-xs5xq\" (UID: \"3204092b-362c-42ed-ab07-3db2d36d32e5\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-xs5xq" Feb 21 21:59:54 crc kubenswrapper[4717]: E0221 21:59:54.090599 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 21 21:59:54 crc kubenswrapper[4717]: E0221 21:59:54.090639 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert podName:6d86a5a0-240a-4b65-af2b-6a5d91d95744 nodeName:}" failed. No retries permitted until 2026-02-21 21:59:54.590626251 +0000 UTC m=+809.372159873 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" (UID: "6d86a5a0-240a-4b65-af2b-6a5d91d95744") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.109508 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brg59\" (UniqueName: \"kubernetes.io/projected/3204092b-362c-42ed-ab07-3db2d36d32e5-kube-api-access-brg59\") pod \"placement-operator-controller-manager-8497b45c89-xs5xq\" (UID: \"3204092b-362c-42ed-ab07-3db2d36d32e5\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-xs5xq" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.118174 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-thwwt"] Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.119220 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxngj\" (UniqueName: \"kubernetes.io/projected/97c089e6-a2dd-4d56-8a8f-9c8d4d8f6f8e-kube-api-access-bxngj\") pod \"telemetry-operator-controller-manager-589c568786-6vmk8\" (UID: \"97c089e6-a2dd-4d56-8a8f-9c8d4d8f6f8e\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.120040 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-thwwt"] Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.120115 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-thwwt" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.122829 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-nxn64" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.122944 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgwvr\" (UniqueName: \"kubernetes.io/projected/a8cafe00-f55d-4444-ae31-827ac956b47c-kube-api-access-xgwvr\") pod \"test-operator-controller-manager-5dc6794d5b-j6mms\" (UID: \"a8cafe00-f55d-4444-ae31-827ac956b47c\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.123113 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqkk7\" (UniqueName: \"kubernetes.io/projected/6d86a5a0-240a-4b65-af2b-6a5d91d95744-kube-api-access-hqkk7\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg\" (UID: \"6d86a5a0-240a-4b65-af2b-6a5d91d95744\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.129090 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt4m6\" (UniqueName: \"kubernetes.io/projected/7a130695-4494-482a-b4fb-4703071fd28f-kube-api-access-kt4m6\") pod \"swift-operator-controller-manager-68f46476f-b5bzj\" (UID: \"7a130695-4494-482a-b4fb-4703071fd28f\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.172170 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj"] Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.172961 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.183708 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.183940 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.184134 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-2n7fb" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.189896 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b5tn\" (UniqueName: \"kubernetes.io/projected/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-kube-api-access-4b5tn\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.189934 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww65s\" (UniqueName: \"kubernetes.io/projected/037ce2e5-e940-4172-80f4-f3d738a9d363-kube-api-access-ww65s\") pod \"watcher-operator-controller-manager-bccc79885-thwwt\" (UID: \"037ce2e5-e940-4172-80f4-f3d738a9d363\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-thwwt" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.189961 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: 
\"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.189979 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.193133 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj"] Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.213523 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-xs5xq" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.231177 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.256382 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-thgdl"] Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.257268 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-thgdl" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.262036 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-thgdl"] Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.264461 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.267322 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-jgvmq" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.292630 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bxkf\" (UniqueName: \"kubernetes.io/projected/8c59a072-f1fc-4ef2-b9c4-f88081ea3a2d-kube-api-access-4bxkf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-thgdl\" (UID: \"8c59a072-f1fc-4ef2-b9c4-f88081ea3a2d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-thgdl" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.292696 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b5tn\" (UniqueName: \"kubernetes.io/projected/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-kube-api-access-4b5tn\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.292729 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww65s\" (UniqueName: \"kubernetes.io/projected/037ce2e5-e940-4172-80f4-f3d738a9d363-kube-api-access-ww65s\") pod \"watcher-operator-controller-manager-bccc79885-thwwt\" (UID: \"037ce2e5-e940-4172-80f4-f3d738a9d363\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-thwwt" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.292763 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs\") pod 
\"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.292781 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 21:59:54 crc kubenswrapper[4717]: E0221 21:59:54.292937 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 21 21:59:54 crc kubenswrapper[4717]: E0221 21:59:54.292985 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs podName:b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:59:54.792970693 +0000 UTC m=+809.574504315 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs") pod "openstack-operator-controller-manager-85dff9d968-589dj" (UID: "b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8") : secret "webhook-server-cert" not found Feb 21 21:59:54 crc kubenswrapper[4717]: E0221 21:59:54.293462 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 21 21:59:54 crc kubenswrapper[4717]: E0221 21:59:54.293495 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs podName:b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8 nodeName:}" failed. 
No retries permitted until 2026-02-21 21:59:54.793488135 +0000 UTC m=+809.575021757 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs") pod "openstack-operator-controller-manager-85dff9d968-589dj" (UID: "b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8") : secret "metrics-server-cert" not found Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.312306 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww65s\" (UniqueName: \"kubernetes.io/projected/037ce2e5-e940-4172-80f4-f3d738a9d363-kube-api-access-ww65s\") pod \"watcher-operator-controller-manager-bccc79885-thwwt\" (UID: \"037ce2e5-e940-4172-80f4-f3d738a9d363\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-thwwt" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.316712 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b5tn\" (UniqueName: \"kubernetes.io/projected/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-kube-api-access-4b5tn\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.317687 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.394454 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bxkf\" (UniqueName: \"kubernetes.io/projected/8c59a072-f1fc-4ef2-b9c4-f88081ea3a2d-kube-api-access-4bxkf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-thgdl\" (UID: \"8c59a072-f1fc-4ef2-b9c4-f88081ea3a2d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-thgdl" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.394578 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert\") pod \"infra-operator-controller-manager-79d975b745-pb7k5\" (UID: \"ff6fe6a4-86ca-4723-915d-b69be63387b6\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5" Feb 21 21:59:54 crc kubenswrapper[4717]: E0221 21:59:54.394765 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 21 21:59:54 crc kubenswrapper[4717]: E0221 21:59:54.394829 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert podName:ff6fe6a4-86ca-4723-915d-b69be63387b6 nodeName:}" failed. No retries permitted until 2026-02-21 21:59:55.394807684 +0000 UTC m=+810.176341306 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert") pod "infra-operator-controller-manager-79d975b745-pb7k5" (UID: "ff6fe6a4-86ca-4723-915d-b69be63387b6") : secret "infra-operator-webhook-server-cert" not found Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.421758 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bxkf\" (UniqueName: \"kubernetes.io/projected/8c59a072-f1fc-4ef2-b9c4-f88081ea3a2d-kube-api-access-4bxkf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-thgdl\" (UID: \"8c59a072-f1fc-4ef2-b9c4-f88081ea3a2d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-thgdl" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.484250 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-thwwt" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.596727 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg\" (UID: \"6d86a5a0-240a-4b65-af2b-6a5d91d95744\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" Feb 21 21:59:54 crc kubenswrapper[4717]: E0221 21:59:54.596974 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 21 21:59:54 crc kubenswrapper[4717]: E0221 21:59:54.597032 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert podName:6d86a5a0-240a-4b65-af2b-6a5d91d95744 nodeName:}" failed. 
No retries permitted until 2026-02-21 21:59:55.597013232 +0000 UTC m=+810.378546854 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" (UID: "6d86a5a0-240a-4b65-af2b-6a5d91d95744") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.633311 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-thgdl" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.676313 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-877lb"] Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.798098 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 21:59:54 crc kubenswrapper[4717]: I0221 21:59:54.798153 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 21:59:54 crc kubenswrapper[4717]: E0221 21:59:54.798284 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 21 21:59:54 crc kubenswrapper[4717]: E0221 21:59:54.798329 4717 secret.go:188] 
Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 21 21:59:54 crc kubenswrapper[4717]: E0221 21:59:54.798350 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs podName:b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:59:55.79833199 +0000 UTC m=+810.579865632 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs") pod "openstack-operator-controller-manager-85dff9d968-589dj" (UID: "b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8") : secret "metrics-server-cert" not found Feb 21 21:59:54 crc kubenswrapper[4717]: E0221 21:59:54.798370 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs podName:b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:59:55.79835891 +0000 UTC m=+810.579892532 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs") pod "openstack-operator-controller-manager-85dff9d968-589dj" (UID: "b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8") : secret "webhook-server-cert" not found Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.111835 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-877lb" event={"ID":"14e06d52-8282-4fcd-9cec-6c29a6336057","Type":"ContainerStarted","Data":"3d7c952f1257425801b556292429dbb6b699a0b32ddc2fa4008cd11b2840315f"} Feb 21 21:59:55 crc kubenswrapper[4717]: W0221 21:59:55.169272 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c916c65_714b_4f8d_b551_c35239deab87.slice/crio-32610901b3046d51f2f6c44676ce10fc856177a9b65dd7a30aa12fd15162121e WatchSource:0}: Error finding container 32610901b3046d51f2f6c44676ce10fc856177a9b65dd7a30aa12fd15162121e: Status 404 returned error can't find the container with id 32610901b3046d51f2f6c44676ce10fc856177a9b65dd7a30aa12fd15162121e Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.181994 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-66ssv"] Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.192456 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-klhlb"] Feb 21 21:59:55 crc kubenswrapper[4717]: W0221 21:59:55.209472 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadb80d2f_050a_47f9_afe2_46cd5876e640.slice/crio-1f44b2f6811a4ab1aea85810ff697fc23f5ed586857ff00676e6f1826c0387ab WatchSource:0}: Error finding container 1f44b2f6811a4ab1aea85810ff697fc23f5ed586857ff00676e6f1826c0387ab: Status 
404 returned error can't find the container with id 1f44b2f6811a4ab1aea85810ff697fc23f5ed586857ff00676e6f1826c0387ab Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.217427 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-lgrct"] Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.226226 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-mflb2"] Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.232389 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-fkch5"] Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.240548 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-blpgd"] Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.246471 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qqrnp"] Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.256148 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8mw2x"] Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.260646 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-mwvrn"] Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.266581 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-xs5xq"] Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.273436 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms"] Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.295811 4717 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl"] Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.305106 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h"] Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.309316 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xgwvr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5dc6794d5b-j6mms_openstack-operators(a8cafe00-f55d-4444-ae31-827ac956b47c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.309416 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mxfbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-5955d8c787-gldvl_openstack-operators(4b328324-d3f1-4de9-b5b0-fb28bd7dfedd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.310577 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl" podUID="4b328324-d3f1-4de9-b5b0-fb28bd7dfedd" Feb 21 21:59:55 crc 
kubenswrapper[4717]: E0221 21:59:55.310635 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms" podUID="a8cafe00-f55d-4444-ae31-827ac956b47c" Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.311731 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8"] Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.316552 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj"] Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.319374 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bjf2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-96s5h_openstack-operators(825c4fa5-a334-48b9-9ae0-583beb7e6a6b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.320576 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h" podUID="825c4fa5-a334-48b9-9ae0-583beb7e6a6b" Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.327502 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kt4m6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-b5bzj_openstack-operators(7a130695-4494-482a-b4fb-4703071fd28f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.328401 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bxngj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-589c568786-6vmk8_openstack-operators(97c089e6-a2dd-4d56-8a8f-9c8d4d8f6f8e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.328774 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj" podUID="7a130695-4494-482a-b4fb-4703071fd28f" Feb 21 21:59:55 crc 
kubenswrapper[4717]: E0221 21:59:55.329603 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8" podUID="97c089e6-a2dd-4d56-8a8f-9c8d4d8f6f8e" Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.405936 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert\") pod \"infra-operator-controller-manager-79d975b745-pb7k5\" (UID: \"ff6fe6a4-86ca-4723-915d-b69be63387b6\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5" Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.406108 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.406158 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert podName:ff6fe6a4-86ca-4723-915d-b69be63387b6 nodeName:}" failed. No retries permitted until 2026-02-21 21:59:57.406143493 +0000 UTC m=+812.187677115 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert") pod "infra-operator-controller-manager-79d975b745-pb7k5" (UID: "ff6fe6a4-86ca-4723-915d-b69be63387b6") : secret "infra-operator-webhook-server-cert" not found Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.467633 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-thwwt"] Feb 21 21:59:55 crc kubenswrapper[4717]: W0221 21:59:55.473412 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5baed11e_00fc_4c09_8a82_fb761682244e.slice/crio-c901ee68b0ad63b3845cadfbc6731adc15be63be7d5f8113fa7be1a0046449bb WatchSource:0}: Error finding container c901ee68b0ad63b3845cadfbc6731adc15be63be7d5f8113fa7be1a0046449bb: Status 404 returned error can't find the container with id c901ee68b0ad63b3845cadfbc6731adc15be63be7d5f8113fa7be1a0046449bb Feb 21 21:59:55 crc kubenswrapper[4717]: W0221 21:59:55.474003 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod037ce2e5_e940_4172_80f4_f3d738a9d363.slice/crio-63aa2d9ba549e0ded061ddcd251ad03e3d6c528d24ebbde7d25a16982a6bf7e4 WatchSource:0}: Error finding container 63aa2d9ba549e0ded061ddcd251ad03e3d6c528d24ebbde7d25a16982a6bf7e4: Status 404 returned error can't find the container with id 63aa2d9ba549e0ded061ddcd251ad03e3d6c528d24ebbde7d25a16982a6bf7e4 Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.474993 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-7glks"] Feb 21 21:59:55 crc kubenswrapper[4717]: W0221 21:59:55.478752 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c59a072_f1fc_4ef2_b9c4_f88081ea3a2d.slice/crio-f97c3ad4a077b41e54fd7f31d9924d99ead72667f65a53d315be93bc2251c848 WatchSource:0}: Error finding container f97c3ad4a077b41e54fd7f31d9924d99ead72667f65a53d315be93bc2251c848: Status 404 returned error can't find the container with id f97c3ad4a077b41e54fd7f31d9924d99ead72667f65a53d315be93bc2251c848 Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.481183 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dvscq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6bd4687957-vzfhh_openstack-operators(5baed11e-00fc-4c09-8a82-fb761682244e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.481904 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh"] Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.482388 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh" podUID="5baed11e-00fc-4c09-8a82-fb761682244e" Feb 21 21:59:55 crc kubenswrapper[4717]: W0221 21:59:55.482705 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd839bd2c_8b12_4d02_a6b5_0399f3ded9fd.slice/crio-68bf71c0552a913e427826c1c610c8708ed0288c06cb9086f41773c2d6372015 WatchSource:0}: Error finding container 68bf71c0552a913e427826c1c610c8708ed0288c06cb9086f41773c2d6372015: Status 404 returned error can't find the container with id 
68bf71c0552a913e427826c1c610c8708ed0288c06cb9086f41773c2d6372015 Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.487724 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4bxkf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-thgdl_openstack-operators(8c59a072-f1fc-4ef2-b9c4-f88081ea3a2d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.488665 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-thgdl"] Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.489174 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-srhrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-7glks_openstack-operators(d839bd2c-8b12-4d02-a6b5-0399f3ded9fd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.490455 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7glks" podUID="d839bd2c-8b12-4d02-a6b5-0399f3ded9fd" Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.490484 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-thgdl" podUID="8c59a072-f1fc-4ef2-b9c4-f88081ea3a2d" Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.612029 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg\" (UID: \"6d86a5a0-240a-4b65-af2b-6a5d91d95744\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.612230 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.612327 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert podName:6d86a5a0-240a-4b65-af2b-6a5d91d95744 nodeName:}" failed. No retries permitted until 2026-02-21 21:59:57.612301287 +0000 UTC m=+812.393834939 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" (UID: "6d86a5a0-240a-4b65-af2b-6a5d91d95744") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.816738 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 21:59:55 crc kubenswrapper[4717]: I0221 21:59:55.816796 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.817083 4717 secret.go:188] Couldn't get 
secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.817097 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.817156 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs podName:b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:59:57.817142768 +0000 UTC m=+812.598676390 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs") pod "openstack-operator-controller-manager-85dff9d968-589dj" (UID: "b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8") : secret "webhook-server-cert" not found Feb 21 21:59:55 crc kubenswrapper[4717]: E0221 21:59:55.817181 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs podName:b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8 nodeName:}" failed. No retries permitted until 2026-02-21 21:59:57.817162868 +0000 UTC m=+812.598696490 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs") pod "openstack-operator-controller-manager-85dff9d968-589dj" (UID: "b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8") : secret "metrics-server-cert" not found Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.146536 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-blpgd" event={"ID":"8cb19ba1-4432-41e7-afee-6fccd02f8564","Type":"ContainerStarted","Data":"c49fd99aa43e18f95c40f6922ecd8e96fc20636c67d8067ab99314df431a7447"} Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.148504 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-thwwt" event={"ID":"037ce2e5-e940-4172-80f4-f3d738a9d363","Type":"ContainerStarted","Data":"63aa2d9ba549e0ded061ddcd251ad03e3d6c528d24ebbde7d25a16982a6bf7e4"} Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.149976 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7glks" event={"ID":"d839bd2c-8b12-4d02-a6b5-0399f3ded9fd","Type":"ContainerStarted","Data":"68bf71c0552a913e427826c1c610c8708ed0288c06cb9086f41773c2d6372015"} Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.151758 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qqrnp" event={"ID":"5b8a35aa-e7ad-4103-b3db-1011411811db","Type":"ContainerStarted","Data":"bb535543e934c35265c53aceb8e7807ead04db3d3f16c6f195450a82b9282b07"} Feb 21 21:59:56 crc kubenswrapper[4717]: E0221 21:59:56.152954 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7glks" podUID="d839bd2c-8b12-4d02-a6b5-0399f3ded9fd" Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.155693 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-mwvrn" event={"ID":"1c89580c-c289-4ca5-b394-a85fa285dc30","Type":"ContainerStarted","Data":"394f090c46c698d5a42a560542ee2e706015e2c6bbbb3462b08c10a9ec703cb3"} Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.158596 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h" event={"ID":"825c4fa5-a334-48b9-9ae0-583beb7e6a6b","Type":"ContainerStarted","Data":"18f432ccd3d886ac8945a2745b473be35aef0fb4955354ab02855e9598936f12"} Feb 21 21:59:56 crc kubenswrapper[4717]: E0221 21:59:56.161085 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h" podUID="825c4fa5-a334-48b9-9ae0-583beb7e6a6b" Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.161086 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh" event={"ID":"5baed11e-00fc-4c09-8a82-fb761682244e","Type":"ContainerStarted","Data":"c901ee68b0ad63b3845cadfbc6731adc15be63be7d5f8113fa7be1a0046449bb"} Feb 21 21:59:56 crc kubenswrapper[4717]: E0221 21:59:56.162319 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh" podUID="5baed11e-00fc-4c09-8a82-fb761682244e" Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.163534 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj" event={"ID":"7a130695-4494-482a-b4fb-4703071fd28f","Type":"ContainerStarted","Data":"1b1ef1d0845015a796e5984e5c6e6d2d4f9f9158d50b2ac4d148130dcd9a3f88"} Feb 21 21:59:56 crc kubenswrapper[4717]: E0221 21:59:56.173559 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj" podUID="7a130695-4494-482a-b4fb-4703071fd28f" Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.176497 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-fkch5" event={"ID":"85ba4b92-6749-498a-b112-db89d6856988","Type":"ContainerStarted","Data":"cfa64f2db96f3132c64b68e996f0f71244b9139d1ca6119f90b4a9976d01ff44"} Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.178419 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mflb2" event={"ID":"adb80d2f-050a-47f9-afe2-46cd5876e640","Type":"ContainerStarted","Data":"1f44b2f6811a4ab1aea85810ff697fc23f5ed586857ff00676e6f1826c0387ab"} Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.181731 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms" 
event={"ID":"a8cafe00-f55d-4444-ae31-827ac956b47c","Type":"ContainerStarted","Data":"ce5b48abd706ab19a756def133ec802b907452077aefd8c3fc8badd57a592ae8"} Feb 21 21:59:56 crc kubenswrapper[4717]: E0221 21:59:56.183752 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms" podUID="a8cafe00-f55d-4444-ae31-827ac956b47c" Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.187077 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8mw2x" event={"ID":"a2620c81-a9f0-4d4c-b281-5e5effb23419","Type":"ContainerStarted","Data":"6d7d18fd6c209d9e34c713791e5b9dd7dedd05089b1dad184256aa2d74117b61"} Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.189365 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-xs5xq" event={"ID":"3204092b-362c-42ed-ab07-3db2d36d32e5","Type":"ContainerStarted","Data":"c5452d46d6e87259d12017c4bf7bb263b9fb50669942c00b52e160668f20d670"} Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.192767 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl" event={"ID":"4b328324-d3f1-4de9-b5b0-fb28bd7dfedd","Type":"ContainerStarted","Data":"34466428e0bca9e7d64a7134df74a2aef65ed93f9e467a4ba87dfb0fc7154501"} Feb 21 21:59:56 crc kubenswrapper[4717]: E0221 21:59:56.195834 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl" podUID="4b328324-d3f1-4de9-b5b0-fb28bd7dfedd" Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.204209 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-66ssv" event={"ID":"8c916c65-714b-4f8d-b551-c35239deab87","Type":"ContainerStarted","Data":"32610901b3046d51f2f6c44676ce10fc856177a9b65dd7a30aa12fd15162121e"} Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.209818 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-thgdl" event={"ID":"8c59a072-f1fc-4ef2-b9c4-f88081ea3a2d","Type":"ContainerStarted","Data":"f97c3ad4a077b41e54fd7f31d9924d99ead72667f65a53d315be93bc2251c848"} Feb 21 21:59:56 crc kubenswrapper[4717]: E0221 21:59:56.212120 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-thgdl" podUID="8c59a072-f1fc-4ef2-b9c4-f88081ea3a2d" Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.213430 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-lgrct" event={"ID":"4caf3d32-5fe1-4711-a79f-7ff3b2bee3a6","Type":"ContainerStarted","Data":"f10acda04d988cd8229c9fce8b6e0e9d3cbe0acea28e59f3535bb2409480a1fc"} Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.214876 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-klhlb" 
event={"ID":"c5916af5-fc6c-4473-aafd-5331043ac1d8","Type":"ContainerStarted","Data":"b2de5c137d6de3b7fab552ec4b44fa390b68eb59a3e32e930cf749c6285a3518"} Feb 21 21:59:56 crc kubenswrapper[4717]: I0221 21:59:56.217324 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8" event={"ID":"97c089e6-a2dd-4d56-8a8f-9c8d4d8f6f8e","Type":"ContainerStarted","Data":"e49a0d0508b17a1bd549a5d455392cdcfe0900b12eda7a92e13cc5d3c9c611ad"} Feb 21 21:59:56 crc kubenswrapper[4717]: E0221 21:59:56.218670 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8" podUID="97c089e6-a2dd-4d56-8a8f-9c8d4d8f6f8e" Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.237039 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8" podUID="97c089e6-a2dd-4d56-8a8f-9c8d4d8f6f8e" Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.237264 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh" podUID="5baed11e-00fc-4c09-8a82-fb761682244e" Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.237606 
4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl" podUID="4b328324-d3f1-4de9-b5b0-fb28bd7dfedd" Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.241620 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7glks" podUID="d839bd2c-8b12-4d02-a6b5-0399f3ded9fd" Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.241656 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-thgdl" podUID="8c59a072-f1fc-4ef2-b9c4-f88081ea3a2d" Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.241676 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms" podUID="a8cafe00-f55d-4444-ae31-827ac956b47c" Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.241699 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h" podUID="825c4fa5-a334-48b9-9ae0-583beb7e6a6b" Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.241710 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj" podUID="7a130695-4494-482a-b4fb-4703071fd28f" Feb 21 21:59:57 crc kubenswrapper[4717]: I0221 21:59:57.453774 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert\") pod \"infra-operator-controller-manager-79d975b745-pb7k5\" (UID: \"ff6fe6a4-86ca-4723-915d-b69be63387b6\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5" Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.453975 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.454030 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert podName:ff6fe6a4-86ca-4723-915d-b69be63387b6 nodeName:}" failed. No retries permitted until 2026-02-21 22:00:01.454015003 +0000 UTC m=+816.235548625 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert") pod "infra-operator-controller-manager-79d975b745-pb7k5" (UID: "ff6fe6a4-86ca-4723-915d-b69be63387b6") : secret "infra-operator-webhook-server-cert" not found Feb 21 21:59:57 crc kubenswrapper[4717]: I0221 21:59:57.660035 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg\" (UID: \"6d86a5a0-240a-4b65-af2b-6a5d91d95744\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.667541 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.667640 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert podName:6d86a5a0-240a-4b65-af2b-6a5d91d95744 nodeName:}" failed. No retries permitted until 2026-02-21 22:00:01.667607123 +0000 UTC m=+816.449140745 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" (UID: "6d86a5a0-240a-4b65-af2b-6a5d91d95744") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.863740 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.863835 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs podName:b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8 nodeName:}" failed. No retries permitted until 2026-02-21 22:00:01.863817479 +0000 UTC m=+816.645351101 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs") pod "openstack-operator-controller-manager-85dff9d968-589dj" (UID: "b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8") : secret "metrics-server-cert" not found Feb 21 21:59:57 crc kubenswrapper[4717]: I0221 21:59:57.863533 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 21:59:57 crc kubenswrapper[4717]: I0221 21:59:57.864106 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " 
pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.864833 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 21 21:59:57 crc kubenswrapper[4717]: E0221 21:59:57.867422 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs podName:b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8 nodeName:}" failed. No retries permitted until 2026-02-21 22:00:01.867401284 +0000 UTC m=+816.648934906 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs") pod "openstack-operator-controller-manager-85dff9d968-589dj" (UID: "b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8") : secret "webhook-server-cert" not found Feb 21 22:00:00 crc kubenswrapper[4717]: I0221 22:00:00.153068 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4"] Feb 21 22:00:00 crc kubenswrapper[4717]: I0221 22:00:00.154454 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" Feb 21 22:00:00 crc kubenswrapper[4717]: I0221 22:00:00.156937 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 21 22:00:00 crc kubenswrapper[4717]: I0221 22:00:00.157352 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 21 22:00:00 crc kubenswrapper[4717]: I0221 22:00:00.160528 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4"] Feb 21 22:00:00 crc kubenswrapper[4717]: I0221 22:00:00.206933 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqm55\" (UniqueName: \"kubernetes.io/projected/5143bc63-52d5-4480-a59a-98f8fc364e53-kube-api-access-tqm55\") pod \"collect-profiles-29528520-whvw4\" (UID: \"5143bc63-52d5-4480-a59a-98f8fc364e53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" Feb 21 22:00:00 crc kubenswrapper[4717]: I0221 22:00:00.207364 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5143bc63-52d5-4480-a59a-98f8fc364e53-secret-volume\") pod \"collect-profiles-29528520-whvw4\" (UID: \"5143bc63-52d5-4480-a59a-98f8fc364e53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" Feb 21 22:00:00 crc kubenswrapper[4717]: I0221 22:00:00.207436 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5143bc63-52d5-4480-a59a-98f8fc364e53-config-volume\") pod \"collect-profiles-29528520-whvw4\" (UID: \"5143bc63-52d5-4480-a59a-98f8fc364e53\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" Feb 21 22:00:00 crc kubenswrapper[4717]: I0221 22:00:00.308993 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5143bc63-52d5-4480-a59a-98f8fc364e53-secret-volume\") pod \"collect-profiles-29528520-whvw4\" (UID: \"5143bc63-52d5-4480-a59a-98f8fc364e53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" Feb 21 22:00:00 crc kubenswrapper[4717]: I0221 22:00:00.309405 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5143bc63-52d5-4480-a59a-98f8fc364e53-config-volume\") pod \"collect-profiles-29528520-whvw4\" (UID: \"5143bc63-52d5-4480-a59a-98f8fc364e53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" Feb 21 22:00:00 crc kubenswrapper[4717]: I0221 22:00:00.309591 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqm55\" (UniqueName: \"kubernetes.io/projected/5143bc63-52d5-4480-a59a-98f8fc364e53-kube-api-access-tqm55\") pod \"collect-profiles-29528520-whvw4\" (UID: \"5143bc63-52d5-4480-a59a-98f8fc364e53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" Feb 21 22:00:00 crc kubenswrapper[4717]: I0221 22:00:00.310494 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5143bc63-52d5-4480-a59a-98f8fc364e53-config-volume\") pod \"collect-profiles-29528520-whvw4\" (UID: \"5143bc63-52d5-4480-a59a-98f8fc364e53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" Feb 21 22:00:00 crc kubenswrapper[4717]: I0221 22:00:00.322516 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5143bc63-52d5-4480-a59a-98f8fc364e53-secret-volume\") pod \"collect-profiles-29528520-whvw4\" (UID: \"5143bc63-52d5-4480-a59a-98f8fc364e53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" Feb 21 22:00:00 crc kubenswrapper[4717]: I0221 22:00:00.325746 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqm55\" (UniqueName: \"kubernetes.io/projected/5143bc63-52d5-4480-a59a-98f8fc364e53-kube-api-access-tqm55\") pod \"collect-profiles-29528520-whvw4\" (UID: \"5143bc63-52d5-4480-a59a-98f8fc364e53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" Feb 21 22:00:00 crc kubenswrapper[4717]: I0221 22:00:00.481536 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" Feb 21 22:00:01 crc kubenswrapper[4717]: I0221 22:00:01.532923 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert\") pod \"infra-operator-controller-manager-79d975b745-pb7k5\" (UID: \"ff6fe6a4-86ca-4723-915d-b69be63387b6\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5" Feb 21 22:00:01 crc kubenswrapper[4717]: E0221 22:00:01.533123 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 21 22:00:01 crc kubenswrapper[4717]: E0221 22:00:01.533203 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert podName:ff6fe6a4-86ca-4723-915d-b69be63387b6 nodeName:}" failed. No retries permitted until 2026-02-21 22:00:09.533183907 +0000 UTC m=+824.314717529 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert") pod "infra-operator-controller-manager-79d975b745-pb7k5" (UID: "ff6fe6a4-86ca-4723-915d-b69be63387b6") : secret "infra-operator-webhook-server-cert" not found Feb 21 22:00:01 crc kubenswrapper[4717]: I0221 22:00:01.734901 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg\" (UID: \"6d86a5a0-240a-4b65-af2b-6a5d91d95744\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" Feb 21 22:00:01 crc kubenswrapper[4717]: E0221 22:00:01.735048 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 21 22:00:01 crc kubenswrapper[4717]: E0221 22:00:01.735092 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert podName:6d86a5a0-240a-4b65-af2b-6a5d91d95744 nodeName:}" failed. No retries permitted until 2026-02-21 22:00:09.735079728 +0000 UTC m=+824.516613350 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" (UID: "6d86a5a0-240a-4b65-af2b-6a5d91d95744") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 21 22:00:01 crc kubenswrapper[4717]: I0221 22:00:01.937075 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 22:00:01 crc kubenswrapper[4717]: I0221 22:00:01.937119 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 22:00:01 crc kubenswrapper[4717]: E0221 22:00:01.937283 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 21 22:00:01 crc kubenswrapper[4717]: E0221 22:00:01.937308 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 21 22:00:01 crc kubenswrapper[4717]: E0221 22:00:01.937373 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs podName:b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8 nodeName:}" failed. No retries permitted until 2026-02-21 22:00:09.937356168 +0000 UTC m=+824.718889790 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs") pod "openstack-operator-controller-manager-85dff9d968-589dj" (UID: "b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8") : secret "webhook-server-cert" not found Feb 21 22:00:01 crc kubenswrapper[4717]: E0221 22:00:01.937389 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs podName:b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8 nodeName:}" failed. No retries permitted until 2026-02-21 22:00:09.937383729 +0000 UTC m=+824.718917351 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs") pod "openstack-operator-controller-manager-85dff9d968-589dj" (UID: "b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8") : secret "metrics-server-cert" not found Feb 21 22:00:07 crc kubenswrapper[4717]: E0221 22:00:07.968166 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867" Feb 21 22:00:07 crc kubenswrapper[4717]: E0221 22:00:07.968951 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9pldz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-554564d7fc-fkch5_openstack-operators(85ba4b92-6749-498a-b112-db89d6856988): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 21 22:00:07 crc kubenswrapper[4717]: E0221 22:00:07.970650 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-fkch5" podUID="85ba4b92-6749-498a-b112-db89d6856988" Feb 21 22:00:08 crc kubenswrapper[4717]: E0221 22:00:08.323547 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-fkch5" podUID="85ba4b92-6749-498a-b112-db89d6856988" Feb 21 22:00:08 crc kubenswrapper[4717]: E0221 22:00:08.614084 4717 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 21 22:00:08 crc kubenswrapper[4717]: E0221 22:00:08.614342 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dfwkv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-mflb2_openstack-operators(adb80d2f-050a-47f9-afe2-46cd5876e640): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 21 22:00:08 crc kubenswrapper[4717]: E0221 22:00:08.615773 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mflb2" podUID="adb80d2f-050a-47f9-afe2-46cd5876e640" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.117077 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4"] Feb 21 22:00:09 crc kubenswrapper[4717]: W0221 22:00:09.123844 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5143bc63_52d5_4480_a59a_98f8fc364e53.slice/crio-3e0a11b20ee26322884b37fe9ca548f447a062071666f7bd5faf45c7d2302ba3 WatchSource:0}: Error finding container 
3e0a11b20ee26322884b37fe9ca548f447a062071666f7bd5faf45c7d2302ba3: Status 404 returned error can't find the container with id 3e0a11b20ee26322884b37fe9ca548f447a062071666f7bd5faf45c7d2302ba3 Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.328684 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qqrnp" event={"ID":"5b8a35aa-e7ad-4103-b3db-1011411811db","Type":"ContainerStarted","Data":"22ec6a621123797b7e698dee9515a917000f54e05719fe6bd2f98d493a8cfbf3"} Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.329452 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qqrnp" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.330854 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-66ssv" event={"ID":"8c916c65-714b-4f8d-b551-c35239deab87","Type":"ContainerStarted","Data":"f0a9eaf6005a69beed1f738f816c645d1a0694dc4a67e8fc908be08e957a7044"} Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.331038 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-66ssv" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.332293 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-blpgd" event={"ID":"8cb19ba1-4432-41e7-afee-6fccd02f8564","Type":"ContainerStarted","Data":"01d516b87d9badbba91f82af09d9583a3ff7f6cd46f797225a34a6ec292f7adf"} Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.332999 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-blpgd" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.334339 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-bccc79885-thwwt" event={"ID":"037ce2e5-e940-4172-80f4-f3d738a9d363","Type":"ContainerStarted","Data":"4122816cc970d765bddf04662fd79d189411459236b2aeff98e530b7d4a3ea06"} Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.334697 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-thwwt" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.335793 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-klhlb" event={"ID":"c5916af5-fc6c-4473-aafd-5331043ac1d8","Type":"ContainerStarted","Data":"0266d22a3874962fada26aeb10353a7cc24e15b22683e31d63b32fdc1b432c27"} Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.336235 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-klhlb" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.338767 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8mw2x" event={"ID":"a2620c81-a9f0-4d4c-b281-5e5effb23419","Type":"ContainerStarted","Data":"51e87a98b40222f4a9bf20ee6f9300685d6013069c5a82c8a3619ac85ced8a29"} Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.339173 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8mw2x" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.340226 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-mwvrn" event={"ID":"1c89580c-c289-4ca5-b394-a85fa285dc30","Type":"ContainerStarted","Data":"2a34a29c98d2b6f907065e55acf1878bbe8825b683c28105579e18761bb0acac"} Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.340634 4717 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-mwvrn" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.346961 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-xs5xq" event={"ID":"3204092b-362c-42ed-ab07-3db2d36d32e5","Type":"ContainerStarted","Data":"f655a2d31ec4a1e5933662299a0750e5df9c1a91a6e100b082bb8dc04dcabca1"} Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.347567 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-xs5xq" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.352380 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-877lb" event={"ID":"14e06d52-8282-4fcd-9cec-6c29a6336057","Type":"ContainerStarted","Data":"0365ca5895730c9f7ec75055ab8a8950ffe0a8ccc6e8e6b89ec764ed7eb0d4ee"} Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.352965 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-877lb" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.357159 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" event={"ID":"5143bc63-52d5-4480-a59a-98f8fc364e53","Type":"ContainerStarted","Data":"3e0a11b20ee26322884b37fe9ca548f447a062071666f7bd5faf45c7d2302ba3"} Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.359571 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-lgrct" event={"ID":"4caf3d32-5fe1-4711-a79f-7ff3b2bee3a6","Type":"ContainerStarted","Data":"759033bf42396e3f4afbc31ceeeda694d4c7029c1b161195456614a60d9bc6b1"} Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.359612 
4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-lgrct" Feb 21 22:00:09 crc kubenswrapper[4717]: E0221 22:00:09.360133 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mflb2" podUID="adb80d2f-050a-47f9-afe2-46cd5876e640" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.392450 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qqrnp" podStartSLOduration=3.031197209 podStartE2EDuration="16.392431883s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.269306426 +0000 UTC m=+810.050840048" lastFinishedPulling="2026-02-21 22:00:08.6305411 +0000 UTC m=+823.412074722" observedRunningTime="2026-02-21 22:00:09.391217445 +0000 UTC m=+824.172751057" watchObservedRunningTime="2026-02-21 22:00:09.392431883 +0000 UTC m=+824.173965505" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.485749 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8mw2x" podStartSLOduration=3.142392854 podStartE2EDuration="16.485730311s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.286295902 +0000 UTC m=+810.067829524" lastFinishedPulling="2026-02-21 22:00:08.629633359 +0000 UTC m=+823.411166981" observedRunningTime="2026-02-21 22:00:09.483298393 +0000 UTC m=+824.264832015" watchObservedRunningTime="2026-02-21 22:00:09.485730311 +0000 UTC m=+824.267263933" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.487884 
4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-66ssv" podStartSLOduration=3.033855413 podStartE2EDuration="16.487877903s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.180823133 +0000 UTC m=+809.962356795" lastFinishedPulling="2026-02-21 22:00:08.634845653 +0000 UTC m=+823.416379285" observedRunningTime="2026-02-21 22:00:09.453281396 +0000 UTC m=+824.234815018" watchObservedRunningTime="2026-02-21 22:00:09.487877903 +0000 UTC m=+824.269411525" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.525819 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-klhlb" podStartSLOduration=3.075751923 podStartE2EDuration="16.525803168s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.183154359 +0000 UTC m=+809.964687991" lastFinishedPulling="2026-02-21 22:00:08.633205614 +0000 UTC m=+823.414739236" observedRunningTime="2026-02-21 22:00:09.524538768 +0000 UTC m=+824.306072390" watchObservedRunningTime="2026-02-21 22:00:09.525803168 +0000 UTC m=+824.307336780" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.562696 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-mwvrn" podStartSLOduration=3.275350119 podStartE2EDuration="16.562681138s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.304820104 +0000 UTC m=+810.086353726" lastFinishedPulling="2026-02-21 22:00:08.592151123 +0000 UTC m=+823.373684745" observedRunningTime="2026-02-21 22:00:09.557598677 +0000 UTC m=+824.339132299" watchObservedRunningTime="2026-02-21 22:00:09.562681138 +0000 UTC m=+824.344214760" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.580642 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert\") pod \"infra-operator-controller-manager-79d975b745-pb7k5\" (UID: \"ff6fe6a4-86ca-4723-915d-b69be63387b6\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5" Feb 21 22:00:09 crc kubenswrapper[4717]: E0221 22:00:09.580807 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 21 22:00:09 crc kubenswrapper[4717]: E0221 22:00:09.580869 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert podName:ff6fe6a4-86ca-4723-915d-b69be63387b6 nodeName:}" failed. No retries permitted until 2026-02-21 22:00:25.580841122 +0000 UTC m=+840.362374744 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert") pod "infra-operator-controller-manager-79d975b745-pb7k5" (UID: "ff6fe6a4-86ca-4723-915d-b69be63387b6") : secret "infra-operator-webhook-server-cert" not found Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.656387 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-877lb" podStartSLOduration=2.794760664 podStartE2EDuration="16.656368416s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:54.730646164 +0000 UTC m=+809.512179776" lastFinishedPulling="2026-02-21 22:00:08.592253906 +0000 UTC m=+823.373787528" observedRunningTime="2026-02-21 22:00:09.633032858 +0000 UTC m=+824.414566480" watchObservedRunningTime="2026-02-21 22:00:09.656368416 +0000 UTC m=+824.437902038" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.657155 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-lgrct" podStartSLOduration=3.210272286 podStartE2EDuration="16.657150035s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.184225555 +0000 UTC m=+809.965759167" lastFinishedPulling="2026-02-21 22:00:08.631103284 +0000 UTC m=+823.412636916" observedRunningTime="2026-02-21 22:00:09.654300487 +0000 UTC m=+824.435834109" watchObservedRunningTime="2026-02-21 22:00:09.657150035 +0000 UTC m=+824.438683657" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.692030 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-xs5xq" podStartSLOduration=3.343976688 podStartE2EDuration="16.692000366s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.304521887 +0000 UTC m=+810.086055499" lastFinishedPulling="2026-02-21 22:00:08.652545555 +0000 UTC m=+823.434079177" observedRunningTime="2026-02-21 22:00:09.687886768 +0000 UTC m=+824.469420390" watchObservedRunningTime="2026-02-21 22:00:09.692000366 +0000 UTC m=+824.473533988" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.718616 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-thwwt" podStartSLOduration=2.565572999 podStartE2EDuration="15.718603601s" podCreationTimestamp="2026-02-21 21:59:54 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.476718879 +0000 UTC m=+810.258252511" lastFinishedPulling="2026-02-21 22:00:08.629749491 +0000 UTC m=+823.411283113" observedRunningTime="2026-02-21 22:00:09.717445884 +0000 UTC m=+824.498979506" watchObservedRunningTime="2026-02-21 22:00:09.718603601 +0000 UTC m=+824.500137223" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.768703 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-blpgd" podStartSLOduration=3.387227361 podStartE2EDuration="16.768680877s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.248378797 +0000 UTC m=+810.029912419" lastFinishedPulling="2026-02-21 22:00:08.629832313 +0000 UTC m=+823.411365935" observedRunningTime="2026-02-21 22:00:09.763775521 +0000 UTC m=+824.545309143" watchObservedRunningTime="2026-02-21 22:00:09.768680877 +0000 UTC m=+824.550214499" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.783513 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg\" (UID: \"6d86a5a0-240a-4b65-af2b-6a5d91d95744\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" Feb 21 22:00:09 crc kubenswrapper[4717]: E0221 22:00:09.783719 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 21 22:00:09 crc kubenswrapper[4717]: E0221 22:00:09.783769 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert podName:6d86a5a0-240a-4b65-af2b-6a5d91d95744 nodeName:}" failed. No retries permitted until 2026-02-21 22:00:25.783754298 +0000 UTC m=+840.565287910 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" (UID: "6d86a5a0-240a-4b65-af2b-6a5d91d95744") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.986955 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 22:00:09 crc kubenswrapper[4717]: I0221 22:00:09.987089 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" Feb 21 22:00:09 crc kubenswrapper[4717]: E0221 22:00:09.987235 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 21 22:00:09 crc kubenswrapper[4717]: E0221 22:00:09.987327 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs podName:b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8 nodeName:}" failed. No retries permitted until 2026-02-21 22:00:25.987314658 +0000 UTC m=+840.768848280 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs") pod "openstack-operator-controller-manager-85dff9d968-589dj" (UID: "b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8") : secret "webhook-server-cert" not found Feb 21 22:00:09 crc kubenswrapper[4717]: E0221 22:00:09.988704 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 21 22:00:09 crc kubenswrapper[4717]: E0221 22:00:09.988796 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs podName:b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8 nodeName:}" failed. No retries permitted until 2026-02-21 22:00:25.988787563 +0000 UTC m=+840.770321185 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs") pod "openstack-operator-controller-manager-85dff9d968-589dj" (UID: "b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8") : secret "metrics-server-cert" not found Feb 21 22:00:12 crc kubenswrapper[4717]: I0221 22:00:12.386778 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" event={"ID":"5143bc63-52d5-4480-a59a-98f8fc364e53","Type":"ContainerStarted","Data":"5ee67d3501af8c06a273cf88a739df40b51172981e58d4472d7d3854414353cd"} Feb 21 22:00:12 crc kubenswrapper[4717]: I0221 22:00:12.408151 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" podStartSLOduration=12.408136303 podStartE2EDuration="12.408136303s" podCreationTimestamp="2026-02-21 22:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:00:12.406267529 +0000 UTC 
m=+827.187801171" watchObservedRunningTime="2026-02-21 22:00:12.408136303 +0000 UTC m=+827.189669925" Feb 21 22:00:13 crc kubenswrapper[4717]: I0221 22:00:13.393967 4717 generic.go:334] "Generic (PLEG): container finished" podID="5143bc63-52d5-4480-a59a-98f8fc364e53" containerID="5ee67d3501af8c06a273cf88a739df40b51172981e58d4472d7d3854414353cd" exitCode=0 Feb 21 22:00:13 crc kubenswrapper[4717]: I0221 22:00:13.394007 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" event={"ID":"5143bc63-52d5-4480-a59a-98f8fc364e53","Type":"ContainerDied","Data":"5ee67d3501af8c06a273cf88a739df40b51172981e58d4472d7d3854414353cd"} Feb 21 22:00:13 crc kubenswrapper[4717]: I0221 22:00:13.784223 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-877lb" Feb 21 22:00:13 crc kubenswrapper[4717]: I0221 22:00:13.871346 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-blpgd" Feb 21 22:00:13 crc kubenswrapper[4717]: I0221 22:00:13.909526 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-mwvrn" Feb 21 22:00:13 crc kubenswrapper[4717]: I0221 22:00:13.984479 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-66ssv" Feb 21 22:00:14 crc kubenswrapper[4717]: I0221 22:00:14.088983 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-8mw2x" Feb 21 22:00:14 crc kubenswrapper[4717]: I0221 22:00:14.216446 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-xs5xq" Feb 
21 22:00:14 crc kubenswrapper[4717]: I0221 22:00:14.493845 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-thwwt" Feb 21 22:00:15 crc kubenswrapper[4717]: I0221 22:00:15.895282 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" Feb 21 22:00:16 crc kubenswrapper[4717]: I0221 22:00:16.088664 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5143bc63-52d5-4480-a59a-98f8fc364e53-config-volume\") pod \"5143bc63-52d5-4480-a59a-98f8fc364e53\" (UID: \"5143bc63-52d5-4480-a59a-98f8fc364e53\") " Feb 21 22:00:16 crc kubenswrapper[4717]: I0221 22:00:16.088799 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqm55\" (UniqueName: \"kubernetes.io/projected/5143bc63-52d5-4480-a59a-98f8fc364e53-kube-api-access-tqm55\") pod \"5143bc63-52d5-4480-a59a-98f8fc364e53\" (UID: \"5143bc63-52d5-4480-a59a-98f8fc364e53\") " Feb 21 22:00:16 crc kubenswrapper[4717]: I0221 22:00:16.088840 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5143bc63-52d5-4480-a59a-98f8fc364e53-secret-volume\") pod \"5143bc63-52d5-4480-a59a-98f8fc364e53\" (UID: \"5143bc63-52d5-4480-a59a-98f8fc364e53\") " Feb 21 22:00:16 crc kubenswrapper[4717]: I0221 22:00:16.089544 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5143bc63-52d5-4480-a59a-98f8fc364e53-config-volume" (OuterVolumeSpecName: "config-volume") pod "5143bc63-52d5-4480-a59a-98f8fc364e53" (UID: "5143bc63-52d5-4480-a59a-98f8fc364e53"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:00:16 crc kubenswrapper[4717]: I0221 22:00:16.102018 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5143bc63-52d5-4480-a59a-98f8fc364e53-kube-api-access-tqm55" (OuterVolumeSpecName: "kube-api-access-tqm55") pod "5143bc63-52d5-4480-a59a-98f8fc364e53" (UID: "5143bc63-52d5-4480-a59a-98f8fc364e53"). InnerVolumeSpecName "kube-api-access-tqm55". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:00:16 crc kubenswrapper[4717]: I0221 22:00:16.116030 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5143bc63-52d5-4480-a59a-98f8fc364e53-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5143bc63-52d5-4480-a59a-98f8fc364e53" (UID: "5143bc63-52d5-4480-a59a-98f8fc364e53"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:00:16 crc kubenswrapper[4717]: I0221 22:00:16.190572 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5143bc63-52d5-4480-a59a-98f8fc364e53-config-volume\") on node \"crc\" DevicePath \"\"" Feb 21 22:00:16 crc kubenswrapper[4717]: I0221 22:00:16.190628 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqm55\" (UniqueName: \"kubernetes.io/projected/5143bc63-52d5-4480-a59a-98f8fc364e53-kube-api-access-tqm55\") on node \"crc\" DevicePath \"\"" Feb 21 22:00:16 crc kubenswrapper[4717]: I0221 22:00:16.190642 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5143bc63-52d5-4480-a59a-98f8fc364e53-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 21 22:00:16 crc kubenswrapper[4717]: I0221 22:00:16.426760 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4" 
event={"ID":"5143bc63-52d5-4480-a59a-98f8fc364e53","Type":"ContainerDied","Data":"3e0a11b20ee26322884b37fe9ca548f447a062071666f7bd5faf45c7d2302ba3"}
Feb 21 22:00:16 crc kubenswrapper[4717]: I0221 22:00:16.426800 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e0a11b20ee26322884b37fe9ca548f447a062071666f7bd5faf45c7d2302ba3"
Feb 21 22:00:16 crc kubenswrapper[4717]: I0221 22:00:16.426890 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4"
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.454016 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms" event={"ID":"a8cafe00-f55d-4444-ae31-827ac956b47c","Type":"ContainerStarted","Data":"eac426c4508d1cd0363c04af9e64f01ae1a6472d0a34eb1da09ed0ee04ac9f71"}
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.454744 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms"
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.455491 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h" event={"ID":"825c4fa5-a334-48b9-9ae0-583beb7e6a6b","Type":"ContainerStarted","Data":"5552a238c6c94011c7d18f67c9ee3ece632f6198f3b9fca418ce3b182aeff752"}
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.455755 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h"
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.457247 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8" event={"ID":"97c089e6-a2dd-4d56-8a8f-9c8d4d8f6f8e","Type":"ContainerStarted","Data":"be7613e4f180ac5fac5aaa044cccb549de6d43fe48205513a43442f62f9e9933"}
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.457411 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8"
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.459498 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7glks" event={"ID":"d839bd2c-8b12-4d02-a6b5-0399f3ded9fd","Type":"ContainerStarted","Data":"e34c5fd2a4602a776ea4dbe1afcc84f4f0d383217176512497dc0f7a35dd7f44"}
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.459676 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7glks"
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.461042 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh" event={"ID":"5baed11e-00fc-4c09-8a82-fb761682244e","Type":"ContainerStarted","Data":"cc62c53c8ec0a2af685e28915b209a5eb66370e6f4a3786ba44654aadf001ee4"}
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.461245 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh"
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.462536 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj" event={"ID":"7a130695-4494-482a-b4fb-4703071fd28f","Type":"ContainerStarted","Data":"f27d4141ee8ae320bb77fe67c33342b6e730c146930e9dfafea39198ceebaa62"}
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.462652 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj"
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.463887 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl" event={"ID":"4b328324-d3f1-4de9-b5b0-fb28bd7dfedd","Type":"ContainerStarted","Data":"174eb1889a12461faa42b1a084ca00b9565c13ee4d908763dacd277bbb0a2abd"}
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.464082 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl"
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.465454 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-thgdl" event={"ID":"8c59a072-f1fc-4ef2-b9c4-f88081ea3a2d","Type":"ContainerStarted","Data":"3271f0138c6bda45676951f5f24874e09ff646cf0a671e809450ef723f0b5e41"}
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.469068 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms" podStartSLOduration=3.309140797 podStartE2EDuration="27.469058916s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.309201849 +0000 UTC m=+810.090735471" lastFinishedPulling="2026-02-21 22:00:19.469119968 +0000 UTC m=+834.250653590" observedRunningTime="2026-02-21 22:00:20.467343964 +0000 UTC m=+835.248877586" watchObservedRunningTime="2026-02-21 22:00:20.469058916 +0000 UTC m=+835.250592538"
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.483083 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl" podStartSLOduration=3.252094434 podStartE2EDuration="27.48306458s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.309360072 +0000 UTC m=+810.090893684" lastFinishedPulling="2026-02-21 22:00:19.540330208 +0000 UTC m=+834.321863830" observedRunningTime="2026-02-21 22:00:20.479189587 +0000 UTC m=+835.260723209" watchObservedRunningTime="2026-02-21 22:00:20.48306458 +0000 UTC m=+835.264598202"
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.498603 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7glks" podStartSLOduration=3.405040745 podStartE2EDuration="27.4985807s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.489057933 +0000 UTC m=+810.270591555" lastFinishedPulling="2026-02-21 22:00:19.582597878 +0000 UTC m=+834.364131510" observedRunningTime="2026-02-21 22:00:20.494553374 +0000 UTC m=+835.276087016" watchObservedRunningTime="2026-02-21 22:00:20.4985807 +0000 UTC m=+835.280114322"
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.514015 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj" podStartSLOduration=3.971719697 podStartE2EDuration="27.514000359s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.327399253 +0000 UTC m=+810.108932875" lastFinishedPulling="2026-02-21 22:00:18.869679925 +0000 UTC m=+833.651213537" observedRunningTime="2026-02-21 22:00:20.510499665 +0000 UTC m=+835.292033287" watchObservedRunningTime="2026-02-21 22:00:20.514000359 +0000 UTC m=+835.295533981"
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.552593 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8" podStartSLOduration=3.401677204 podStartE2EDuration="27.552578059s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.328316715 +0000 UTC m=+810.109850337" lastFinishedPulling="2026-02-21 22:00:19.47921758 +0000 UTC m=+834.260751192" observedRunningTime="2026-02-21 22:00:20.53336113 +0000 UTC m=+835.314894752" watchObservedRunningTime="2026-02-21 22:00:20.552578059 +0000 UTC m=+835.334111681"
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.552726 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-thgdl" podStartSLOduration=2.559535574 podStartE2EDuration="26.552721673s" podCreationTimestamp="2026-02-21 21:59:54 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.484591176 +0000 UTC m=+810.266124798" lastFinishedPulling="2026-02-21 22:00:19.477777275 +0000 UTC m=+834.259310897" observedRunningTime="2026-02-21 22:00:20.546041594 +0000 UTC m=+835.327575216" watchObservedRunningTime="2026-02-21 22:00:20.552721673 +0000 UTC m=+835.334255295"
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.560720 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh" podStartSLOduration=3.573320784 podStartE2EDuration="27.560704393s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.48093299 +0000 UTC m=+810.262466652" lastFinishedPulling="2026-02-21 22:00:19.468316639 +0000 UTC m=+834.249850261" observedRunningTime="2026-02-21 22:00:20.559976066 +0000 UTC m=+835.341509688" watchObservedRunningTime="2026-02-21 22:00:20.560704393 +0000 UTC m=+835.342238015"
Feb 21 22:00:20 crc kubenswrapper[4717]: I0221 22:00:20.577054 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h" podStartSLOduration=3.356260161 podStartE2EDuration="27.577039614s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.319269059 +0000 UTC m=+810.100802681" lastFinishedPulling="2026-02-21 22:00:19.540048512 +0000 UTC m=+834.321582134" observedRunningTime="2026-02-21 22:00:20.574907602 +0000 UTC m=+835.356441224" watchObservedRunningTime="2026-02-21 22:00:20.577039614 +0000 UTC m=+835.358573236"
Feb 21 22:00:21 crc kubenswrapper[4717]: I0221 22:00:21.474269 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mflb2" event={"ID":"adb80d2f-050a-47f9-afe2-46cd5876e640","Type":"ContainerStarted","Data":"759e53ce5fc1fbbf851e5101be0f89920fd8fbaf387e256a3899177b9ef33cf8"}
Feb 21 22:00:21 crc kubenswrapper[4717]: I0221 22:00:21.495232 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mflb2" podStartSLOduration=3.3378274709999998 podStartE2EDuration="28.495208037s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.22882708 +0000 UTC m=+810.010360712" lastFinishedPulling="2026-02-21 22:00:20.386207656 +0000 UTC m=+835.167741278" observedRunningTime="2026-02-21 22:00:21.488822605 +0000 UTC m=+836.270356237" watchObservedRunningTime="2026-02-21 22:00:21.495208037 +0000 UTC m=+836.276741669"
Feb 21 22:00:23 crc kubenswrapper[4717]: I0221 22:00:23.493943 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-fkch5" event={"ID":"85ba4b92-6749-498a-b112-db89d6856988","Type":"ContainerStarted","Data":"09f41cd95f520fd154249ab2dab9bb57f7e03fb05445c4efe4348258a08e5c1f"}
Feb 21 22:00:23 crc kubenswrapper[4717]: I0221 22:00:23.495796 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-fkch5"
Feb 21 22:00:23 crc kubenswrapper[4717]: I0221 22:00:23.519563 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-fkch5" podStartSLOduration=3.248949478 podStartE2EDuration="30.519537315s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 21:59:55.212339336 +0000 UTC m=+809.993872958" lastFinishedPulling="2026-02-21 22:00:22.482927133 +0000 UTC m=+837.264460795" observedRunningTime="2026-02-21 22:00:23.512024006 +0000 UTC m=+838.293557668" watchObservedRunningTime="2026-02-21 22:00:23.519537315 +0000 UTC m=+838.301070957"
Feb 21 22:00:23 crc kubenswrapper[4717]: I0221 22:00:23.810986 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-klhlb"
Feb 21 22:00:23 crc kubenswrapper[4717]: I0221 22:00:23.853417 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-lgrct"
Feb 21 22:00:23 crc kubenswrapper[4717]: I0221 22:00:23.920362 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-qqrnp"
Feb 21 22:00:23 crc kubenswrapper[4717]: I0221 22:00:23.999545 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mflb2"
Feb 21 22:00:24 crc kubenswrapper[4717]: I0221 22:00:24.045192 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-96s5h"
Feb 21 22:00:24 crc kubenswrapper[4717]: I0221 22:00:24.069765 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-7glks"
Feb 21 22:00:24 crc kubenswrapper[4717]: I0221 22:00:24.084331 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-vzfhh"
Feb 21 22:00:24 crc kubenswrapper[4717]: I0221 22:00:24.100044 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-gldvl"
Feb 21 22:00:24 crc kubenswrapper[4717]: I0221 22:00:24.234200 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-b5bzj"
Feb 21 22:00:24 crc kubenswrapper[4717]: I0221 22:00:24.268607 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-6vmk8"
Feb 21 22:00:24 crc kubenswrapper[4717]: I0221 22:00:24.320732 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-j6mms"
Feb 21 22:00:25 crc kubenswrapper[4717]: I0221 22:00:25.633982 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert\") pod \"infra-operator-controller-manager-79d975b745-pb7k5\" (UID: \"ff6fe6a4-86ca-4723-915d-b69be63387b6\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5"
Feb 21 22:00:25 crc kubenswrapper[4717]: I0221 22:00:25.645942 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff6fe6a4-86ca-4723-915d-b69be63387b6-cert\") pod \"infra-operator-controller-manager-79d975b745-pb7k5\" (UID: \"ff6fe6a4-86ca-4723-915d-b69be63387b6\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5"
Feb 21 22:00:25 crc kubenswrapper[4717]: I0221 22:00:25.738662 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5"
Feb 21 22:00:25 crc kubenswrapper[4717]: I0221 22:00:25.837780 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg\" (UID: \"6d86a5a0-240a-4b65-af2b-6a5d91d95744\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg"
Feb 21 22:00:25 crc kubenswrapper[4717]: I0221 22:00:25.851841 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d86a5a0-240a-4b65-af2b-6a5d91d95744-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg\" (UID: \"6d86a5a0-240a-4b65-af2b-6a5d91d95744\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg"
Feb 21 22:00:25 crc kubenswrapper[4717]: I0221 22:00:25.990187 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-rhwtq"
Feb 21 22:00:25 crc kubenswrapper[4717]: I0221 22:00:25.996951 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg"
Feb 21 22:00:26 crc kubenswrapper[4717]: I0221 22:00:26.036185 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5"]
Feb 21 22:00:26 crc kubenswrapper[4717]: I0221 22:00:26.040446 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj"
Feb 21 22:00:26 crc kubenswrapper[4717]: I0221 22:00:26.040485 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj"
Feb 21 22:00:26 crc kubenswrapper[4717]: I0221 22:00:26.045154 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-metrics-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj"
Feb 21 22:00:26 crc kubenswrapper[4717]: I0221 22:00:26.051452 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8-webhook-certs\") pod \"openstack-operator-controller-manager-85dff9d968-589dj\" (UID: \"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8\") " pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj"
Feb 21 22:00:26 crc kubenswrapper[4717]: W0221 22:00:26.056139 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff6fe6a4_86ca_4723_915d_b69be63387b6.slice/crio-fe4c645bb2b86c763c349baacf3c9813f9aa2211516e3855ae95c156c78ec3a3 WatchSource:0}: Error finding container fe4c645bb2b86c763c349baacf3c9813f9aa2211516e3855ae95c156c78ec3a3: Status 404 returned error can't find the container with id fe4c645bb2b86c763c349baacf3c9813f9aa2211516e3855ae95c156c78ec3a3
Feb 21 22:00:26 crc kubenswrapper[4717]: I0221 22:00:26.206536 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg"]
Feb 21 22:00:26 crc kubenswrapper[4717]: W0221 22:00:26.211227 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d86a5a0_240a_4b65_af2b_6a5d91d95744.slice/crio-b5b9891868cd4029e575209722fea109d47e8cbaf22e0b9601c24b98935487f8 WatchSource:0}: Error finding container b5b9891868cd4029e575209722fea109d47e8cbaf22e0b9601c24b98935487f8: Status 404 returned error can't find the container with id b5b9891868cd4029e575209722fea109d47e8cbaf22e0b9601c24b98935487f8
Feb 21 22:00:26 crc kubenswrapper[4717]: I0221 22:00:26.305420 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-2n7fb"
Feb 21 22:00:26 crc kubenswrapper[4717]: I0221 22:00:26.313964 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj"
Feb 21 22:00:26 crc kubenswrapper[4717]: I0221 22:00:26.512880 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5" event={"ID":"ff6fe6a4-86ca-4723-915d-b69be63387b6","Type":"ContainerStarted","Data":"fe4c645bb2b86c763c349baacf3c9813f9aa2211516e3855ae95c156c78ec3a3"}
Feb 21 22:00:26 crc kubenswrapper[4717]: I0221 22:00:26.513926 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" event={"ID":"6d86a5a0-240a-4b65-af2b-6a5d91d95744","Type":"ContainerStarted","Data":"b5b9891868cd4029e575209722fea109d47e8cbaf22e0b9601c24b98935487f8"}
Feb 21 22:00:26 crc kubenswrapper[4717]: I0221 22:00:26.537484 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj"]
Feb 21 22:00:26 crc kubenswrapper[4717]: W0221 22:00:26.546890 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb441ea16_f9a2_4ce5_8902_6bb4bcdc18e8.slice/crio-bd6c07e8fc0b789be209518400eb4fdadbf7e7d9384b9bbd4b0e1fc101fc0da4 WatchSource:0}: Error finding container bd6c07e8fc0b789be209518400eb4fdadbf7e7d9384b9bbd4b0e1fc101fc0da4: Status 404 returned error can't find the container with id bd6c07e8fc0b789be209518400eb4fdadbf7e7d9384b9bbd4b0e1fc101fc0da4
Feb 21 22:00:27 crc kubenswrapper[4717]: I0221 22:00:27.521449 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" event={"ID":"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8","Type":"ContainerStarted","Data":"310ed3c6cb2fd182523f17545720c57e76b0c201dc303930e54822ecae41fe47"}
Feb 21 22:00:27 crc kubenswrapper[4717]: I0221 22:00:27.521728 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" event={"ID":"b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8","Type":"ContainerStarted","Data":"bd6c07e8fc0b789be209518400eb4fdadbf7e7d9384b9bbd4b0e1fc101fc0da4"}
Feb 21 22:00:27 crc kubenswrapper[4717]: I0221 22:00:27.521777 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj"
Feb 21 22:00:27 crc kubenswrapper[4717]: I0221 22:00:27.554728 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj" podStartSLOduration=33.554703658 podStartE2EDuration="33.554703658s" podCreationTimestamp="2026-02-21 21:59:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:00:27.549056113 +0000 UTC m=+842.330589735" watchObservedRunningTime="2026-02-21 22:00:27.554703658 +0000 UTC m=+842.336237290"
Feb 21 22:00:29 crc kubenswrapper[4717]: I0221 22:00:29.540936 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5" event={"ID":"ff6fe6a4-86ca-4723-915d-b69be63387b6","Type":"ContainerStarted","Data":"df91db8b31a99497c388f91c5a2f3ee8518ca21c1a0ae0f18372f4c7129fe473"}
Feb 21 22:00:29 crc kubenswrapper[4717]: I0221 22:00:29.542310 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5"
Feb 21 22:00:29 crc kubenswrapper[4717]: I0221 22:00:29.544031 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" event={"ID":"6d86a5a0-240a-4b65-af2b-6a5d91d95744","Type":"ContainerStarted","Data":"266c2f730326050949d722f5b9446b75fb582fc9b96c159b89cdcedbe3447332"}
Feb 21 22:00:29 crc kubenswrapper[4717]: I0221 22:00:29.544210 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg"
Feb 21 22:00:29 crc kubenswrapper[4717]: I0221 22:00:29.557261 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5" podStartSLOduration=33.978976631 podStartE2EDuration="36.557238255s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 22:00:26.057668682 +0000 UTC m=+840.839202304" lastFinishedPulling="2026-02-21 22:00:28.635930306 +0000 UTC m=+843.417463928" observedRunningTime="2026-02-21 22:00:29.556700023 +0000 UTC m=+844.338233675" watchObservedRunningTime="2026-02-21 22:00:29.557238255 +0000 UTC m=+844.338771917"
Feb 21 22:00:29 crc kubenswrapper[4717]: I0221 22:00:29.593538 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg" podStartSLOduration=34.171547699 podStartE2EDuration="36.593505951s" podCreationTimestamp="2026-02-21 21:59:53 +0000 UTC" firstStartedPulling="2026-02-21 22:00:26.213292038 +0000 UTC m=+840.994825660" lastFinishedPulling="2026-02-21 22:00:28.63525029 +0000 UTC m=+843.416783912" observedRunningTime="2026-02-21 22:00:29.586549625 +0000 UTC m=+844.368083247" watchObservedRunningTime="2026-02-21 22:00:29.593505951 +0000 UTC m=+844.375039613"
Feb 21 22:00:33 crc kubenswrapper[4717]: I0221 22:00:33.959259 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-fkch5"
Feb 21 22:00:34 crc kubenswrapper[4717]: I0221 22:00:34.004374 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mflb2"
Feb 21 22:00:35 crc kubenswrapper[4717]: I0221 22:00:35.748695 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-pb7k5"
Feb 21 22:00:36 crc kubenswrapper[4717]: I0221 22:00:36.004104 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg"
Feb 21 22:00:36 crc kubenswrapper[4717]: I0221 22:00:36.325026 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-85dff9d968-589dj"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.573795 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4m857"]
Feb 21 22:00:58 crc kubenswrapper[4717]: E0221 22:00:58.574720 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5143bc63-52d5-4480-a59a-98f8fc364e53" containerName="collect-profiles"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.574736 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5143bc63-52d5-4480-a59a-98f8fc364e53" containerName="collect-profiles"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.574923 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5143bc63-52d5-4480-a59a-98f8fc364e53" containerName="collect-profiles"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.575777 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4m857"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.587886 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.588898 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.589349 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.589700 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-r2l7f"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.594136 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4m857"]
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.718745 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4sj2r"]
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.719888 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4sj2r"]
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.719962 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.722987 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.736076 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a97217-cc9e-41e2-8406-6e288988b05d-config\") pod \"dnsmasq-dns-675f4bcbfc-4m857\" (UID: \"71a97217-cc9e-41e2-8406-6e288988b05d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4m857"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.736117 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s679g\" (UniqueName: \"kubernetes.io/projected/71a97217-cc9e-41e2-8406-6e288988b05d-kube-api-access-s679g\") pod \"dnsmasq-dns-675f4bcbfc-4m857\" (UID: \"71a97217-cc9e-41e2-8406-6e288988b05d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4m857"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.837204 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxn2d\" (UniqueName: \"kubernetes.io/projected/03aa84f4-10f8-47b0-8d12-c9340afac6fd-kube-api-access-sxn2d\") pod \"dnsmasq-dns-78dd6ddcc-4sj2r\" (UID: \"03aa84f4-10f8-47b0-8d12-c9340afac6fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.837605 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03aa84f4-10f8-47b0-8d12-c9340afac6fd-config\") pod \"dnsmasq-dns-78dd6ddcc-4sj2r\" (UID: \"03aa84f4-10f8-47b0-8d12-c9340afac6fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.837770 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a97217-cc9e-41e2-8406-6e288988b05d-config\") pod \"dnsmasq-dns-675f4bcbfc-4m857\" (UID: \"71a97217-cc9e-41e2-8406-6e288988b05d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4m857"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.837925 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s679g\" (UniqueName: \"kubernetes.io/projected/71a97217-cc9e-41e2-8406-6e288988b05d-kube-api-access-s679g\") pod \"dnsmasq-dns-675f4bcbfc-4m857\" (UID: \"71a97217-cc9e-41e2-8406-6e288988b05d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4m857"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.838060 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03aa84f4-10f8-47b0-8d12-c9340afac6fd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4sj2r\" (UID: \"03aa84f4-10f8-47b0-8d12-c9340afac6fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.839246 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a97217-cc9e-41e2-8406-6e288988b05d-config\") pod \"dnsmasq-dns-675f4bcbfc-4m857\" (UID: \"71a97217-cc9e-41e2-8406-6e288988b05d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4m857"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.866667 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s679g\" (UniqueName: \"kubernetes.io/projected/71a97217-cc9e-41e2-8406-6e288988b05d-kube-api-access-s679g\") pod \"dnsmasq-dns-675f4bcbfc-4m857\" (UID: \"71a97217-cc9e-41e2-8406-6e288988b05d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4m857"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.899426 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4m857"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.939626 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03aa84f4-10f8-47b0-8d12-c9340afac6fd-config\") pod \"dnsmasq-dns-78dd6ddcc-4sj2r\" (UID: \"03aa84f4-10f8-47b0-8d12-c9340afac6fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.939926 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03aa84f4-10f8-47b0-8d12-c9340afac6fd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4sj2r\" (UID: \"03aa84f4-10f8-47b0-8d12-c9340afac6fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.940051 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxn2d\" (UniqueName: \"kubernetes.io/projected/03aa84f4-10f8-47b0-8d12-c9340afac6fd-kube-api-access-sxn2d\") pod \"dnsmasq-dns-78dd6ddcc-4sj2r\" (UID: \"03aa84f4-10f8-47b0-8d12-c9340afac6fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.940655 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03aa84f4-10f8-47b0-8d12-c9340afac6fd-config\") pod \"dnsmasq-dns-78dd6ddcc-4sj2r\" (UID: \"03aa84f4-10f8-47b0-8d12-c9340afac6fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.940692 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03aa84f4-10f8-47b0-8d12-c9340afac6fd-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4sj2r\" (UID: \"03aa84f4-10f8-47b0-8d12-c9340afac6fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r"
Feb 21 22:00:58 crc kubenswrapper[4717]: I0221 22:00:58.969553 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxn2d\" (UniqueName: \"kubernetes.io/projected/03aa84f4-10f8-47b0-8d12-c9340afac6fd-kube-api-access-sxn2d\") pod \"dnsmasq-dns-78dd6ddcc-4sj2r\" (UID: \"03aa84f4-10f8-47b0-8d12-c9340afac6fd\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r"
Feb 21 22:00:59 crc kubenswrapper[4717]: I0221 22:00:59.048579 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r"
Feb 21 22:00:59 crc kubenswrapper[4717]: I0221 22:00:59.401436 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4m857"]
Feb 21 22:00:59 crc kubenswrapper[4717]: I0221 22:00:59.403599 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 21 22:00:59 crc kubenswrapper[4717]: I0221 22:00:59.509395 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4sj2r"]
Feb 21 22:00:59 crc kubenswrapper[4717]: W0221 22:00:59.514586 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03aa84f4_10f8_47b0_8d12_c9340afac6fd.slice/crio-54d24dac55773967955ff63b5ee88e8f621d66c2f51fed8b22dbb5b9ce222c5e WatchSource:0}: Error finding container 54d24dac55773967955ff63b5ee88e8f621d66c2f51fed8b22dbb5b9ce222c5e: Status 404 returned error can't find the container with id 54d24dac55773967955ff63b5ee88e8f621d66c2f51fed8b22dbb5b9ce222c5e
Feb 21 22:00:59 crc kubenswrapper[4717]: I0221 22:00:59.832711 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r" event={"ID":"03aa84f4-10f8-47b0-8d12-c9340afac6fd","Type":"ContainerStarted","Data":"54d24dac55773967955ff63b5ee88e8f621d66c2f51fed8b22dbb5b9ce222c5e"}
Feb 21 22:00:59 crc kubenswrapper[4717]: I0221 22:00:59.835825 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-4m857" event={"ID":"71a97217-cc9e-41e2-8406-6e288988b05d","Type":"ContainerStarted","Data":"6181c85ccf0c70a5d2fcbaa99db75dfb47cf2667d5f6c373d5edf3906348bcf6"}
Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.271592 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4m857"]
Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.300113 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pt5j9"]
Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.301380 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pt5j9"
Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.312577 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pt5j9"]
Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.386613 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pt5j9\" (UID: \"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c\") " pod="openstack/dnsmasq-dns-666b6646f7-pt5j9"
Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.386672 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg7z7\" (UniqueName: \"kubernetes.io/projected/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-kube-api-access-gg7z7\") pod \"dnsmasq-dns-666b6646f7-pt5j9\" (UID: \"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c\") " pod="openstack/dnsmasq-dns-666b6646f7-pt5j9"
Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.386728 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-config\") pod \"dnsmasq-dns-666b6646f7-pt5j9\" (UID: \"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c\") " pod="openstack/dnsmasq-dns-666b6646f7-pt5j9"
Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.488659 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pt5j9\" (UID: \"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c\") " pod="openstack/dnsmasq-dns-666b6646f7-pt5j9"
Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.488733 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg7z7\" (UniqueName: \"kubernetes.io/projected/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-kube-api-access-gg7z7\") pod \"dnsmasq-dns-666b6646f7-pt5j9\" (UID: \"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c\") " pod="openstack/dnsmasq-dns-666b6646f7-pt5j9"
Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.488825 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-config\") pod \"dnsmasq-dns-666b6646f7-pt5j9\" (UID: \"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c\") " pod="openstack/dnsmasq-dns-666b6646f7-pt5j9"
Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.489733 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pt5j9\" (UID: \"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c\") " pod="openstack/dnsmasq-dns-666b6646f7-pt5j9"
Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.489806 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-config\") pod \"dnsmasq-dns-666b6646f7-pt5j9\" (UID: \"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c\") " pod="openstack/dnsmasq-dns-666b6646f7-pt5j9"
Feb 21 22:01:01 crc
kubenswrapper[4717]: I0221 22:01:01.510986 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg7z7\" (UniqueName: \"kubernetes.io/projected/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-kube-api-access-gg7z7\") pod \"dnsmasq-dns-666b6646f7-pt5j9\" (UID: \"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c\") " pod="openstack/dnsmasq-dns-666b6646f7-pt5j9" Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.550301 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4sj2r"] Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.563457 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2lsdg"] Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.564711 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.576911 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2lsdg"] Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.617111 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pt5j9" Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.692658 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f38ebc64-08ee-432f-a8af-85e84a8608ee-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2lsdg\" (UID: \"f38ebc64-08ee-432f-a8af-85e84a8608ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.693106 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f38ebc64-08ee-432f-a8af-85e84a8608ee-config\") pod \"dnsmasq-dns-57d769cc4f-2lsdg\" (UID: \"f38ebc64-08ee-432f-a8af-85e84a8608ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.693194 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6rvm\" (UniqueName: \"kubernetes.io/projected/f38ebc64-08ee-432f-a8af-85e84a8608ee-kube-api-access-j6rvm\") pod \"dnsmasq-dns-57d769cc4f-2lsdg\" (UID: \"f38ebc64-08ee-432f-a8af-85e84a8608ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.794778 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6rvm\" (UniqueName: \"kubernetes.io/projected/f38ebc64-08ee-432f-a8af-85e84a8608ee-kube-api-access-j6rvm\") pod \"dnsmasq-dns-57d769cc4f-2lsdg\" (UID: \"f38ebc64-08ee-432f-a8af-85e84a8608ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.794847 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f38ebc64-08ee-432f-a8af-85e84a8608ee-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2lsdg\" (UID: 
\"f38ebc64-08ee-432f-a8af-85e84a8608ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.794892 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f38ebc64-08ee-432f-a8af-85e84a8608ee-config\") pod \"dnsmasq-dns-57d769cc4f-2lsdg\" (UID: \"f38ebc64-08ee-432f-a8af-85e84a8608ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.795673 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f38ebc64-08ee-432f-a8af-85e84a8608ee-config\") pod \"dnsmasq-dns-57d769cc4f-2lsdg\" (UID: \"f38ebc64-08ee-432f-a8af-85e84a8608ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.795757 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f38ebc64-08ee-432f-a8af-85e84a8608ee-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2lsdg\" (UID: \"f38ebc64-08ee-432f-a8af-85e84a8608ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.811621 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6rvm\" (UniqueName: \"kubernetes.io/projected/f38ebc64-08ee-432f-a8af-85e84a8608ee-kube-api-access-j6rvm\") pod \"dnsmasq-dns-57d769cc4f-2lsdg\" (UID: \"f38ebc64-08ee-432f-a8af-85e84a8608ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" Feb 21 22:01:01 crc kubenswrapper[4717]: I0221 22:01:01.887456 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.115559 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pt5j9"] Feb 21 22:01:02 crc kubenswrapper[4717]: W0221 22:01:02.124402 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c3da2c9_9230_4114_b6fd_95f79b5d7c3c.slice/crio-8ada62fff427e7a51e15f9aa598fce99398339a8f200f0a0b723248c61a456ff WatchSource:0}: Error finding container 8ada62fff427e7a51e15f9aa598fce99398339a8f200f0a0b723248c61a456ff: Status 404 returned error can't find the container with id 8ada62fff427e7a51e15f9aa598fce99398339a8f200f0a0b723248c61a456ff Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.313267 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2lsdg"] Feb 21 22:01:02 crc kubenswrapper[4717]: W0221 22:01:02.316264 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf38ebc64_08ee_432f_a8af_85e84a8608ee.slice/crio-ccb501e30feddf9dc6c9abda1d11e86678172b6462d5525551d9705a750f0c37 WatchSource:0}: Error finding container ccb501e30feddf9dc6c9abda1d11e86678172b6462d5525551d9705a750f0c37: Status 404 returned error can't find the container with id ccb501e30feddf9dc6c9abda1d11e86678172b6462d5525551d9705a750f0c37 Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.431530 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.432932 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.435048 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.435303 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.435362 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.435381 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.435569 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.435716 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dldpr" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.436060 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.436623 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.611663 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.611706 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-config-data\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.611724 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.611748 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.611773 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.611792 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.611812 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.611846 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5nkt\" (UniqueName: \"kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-kube-api-access-w5nkt\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.611926 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.612232 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.612330 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.703556 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.706111 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.710667 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.711058 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.711210 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.711375 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-swqhk" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.711557 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.711699 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.711837 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.713458 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.713523 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " 
pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.713574 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.713598 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.713621 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-config-data\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.713652 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.713687 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.713713 4717 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.713742 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.713792 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5nkt\" (UniqueName: \"kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-kube-api-access-w5nkt\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.713824 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.715052 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-config-data\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.716702 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.719715 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.722164 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.722685 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.722912 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.723811 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.725245 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.728688 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.734244 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.741144 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.743775 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5nkt\" (UniqueName: \"kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-kube-api-access-w5nkt\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.756660 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " pod="openstack/rabbitmq-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.814488 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.814531 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njnjr\" (UniqueName: \"kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-kube-api-access-njnjr\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.814566 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.814726 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.814783 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.814822 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.814839 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.814888 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.815048 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.815100 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:02 crc kubenswrapper[4717]: I0221 22:01:02.815151 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.000408 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njnjr\" (UniqueName: \"kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-kube-api-access-njnjr\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.000698 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.000733 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.000755 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.000786 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.000800 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.000830 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.000874 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.000892 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.000917 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.000951 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.001987 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.002281 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.002479 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.004146 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") device mount path \"/mnt/openstack/pv06\"" 
pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.004518 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" event={"ID":"f38ebc64-08ee-432f-a8af-85e84a8608ee","Type":"ContainerStarted","Data":"ccb501e30feddf9dc6c9abda1d11e86678172b6462d5525551d9705a750f0c37"} Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.005945 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pt5j9" event={"ID":"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c","Type":"ContainerStarted","Data":"8ada62fff427e7a51e15f9aa598fce99398339a8f200f0a0b723248c61a456ff"} Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.014676 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.015251 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.015709 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.023725 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.028285 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.028454 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.028461 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njnjr\" (UniqueName: \"kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-kube-api-access-njnjr\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.036919 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.056535 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.205049 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:01:03 crc kubenswrapper[4717]: I0221 22:01:03.943086 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:03.944279 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:03.951561 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-v4x7c" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:03.952555 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:03.952689 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:03.952923 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:03.958056 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:03.995539 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.112882 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e49285f0-879f-40db-8eb9-2e8e18a87bb7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.112926 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/e49285f0-879f-40db-8eb9-2e8e18a87bb7-kolla-config\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.112955 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49285f0-879f-40db-8eb9-2e8e18a87bb7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.112977 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.112995 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e49285f0-879f-40db-8eb9-2e8e18a87bb7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.113015 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e49285f0-879f-40db-8eb9-2e8e18a87bb7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.113076 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnbbr\" (UniqueName: 
\"kubernetes.io/projected/e49285f0-879f-40db-8eb9-2e8e18a87bb7-kube-api-access-xnbbr\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.113097 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e49285f0-879f-40db-8eb9-2e8e18a87bb7-config-data-default\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.215655 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnbbr\" (UniqueName: \"kubernetes.io/projected/e49285f0-879f-40db-8eb9-2e8e18a87bb7-kube-api-access-xnbbr\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.215706 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e49285f0-879f-40db-8eb9-2e8e18a87bb7-config-data-default\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.215779 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e49285f0-879f-40db-8eb9-2e8e18a87bb7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.215795 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e49285f0-879f-40db-8eb9-2e8e18a87bb7-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.215818 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49285f0-879f-40db-8eb9-2e8e18a87bb7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.215835 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.215852 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e49285f0-879f-40db-8eb9-2e8e18a87bb7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.215886 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e49285f0-879f-40db-8eb9-2e8e18a87bb7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.217347 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e49285f0-879f-40db-8eb9-2e8e18a87bb7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 
22:01:04.218362 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e49285f0-879f-40db-8eb9-2e8e18a87bb7-config-data-default\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.218887 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e49285f0-879f-40db-8eb9-2e8e18a87bb7-kolla-config\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.218980 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.221531 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e49285f0-879f-40db-8eb9-2e8e18a87bb7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.223562 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49285f0-879f-40db-8eb9-2e8e18a87bb7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.238668 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e49285f0-879f-40db-8eb9-2e8e18a87bb7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.245559 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnbbr\" (UniqueName: \"kubernetes.io/projected/e49285f0-879f-40db-8eb9-2e8e18a87bb7-kube-api-access-xnbbr\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.247103 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"e49285f0-879f-40db-8eb9-2e8e18a87bb7\") " pod="openstack/openstack-galera-0" Feb 21 22:01:04 crc kubenswrapper[4717]: I0221 22:01:04.266745 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.348321 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.349671 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.355581 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.401477 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.401926 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.402105 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.402923 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-c8qcc" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.533633 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4a73e4bf-8575-43f8-bfff-35b8ca593732-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.533972 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4a73e4bf-8575-43f8-bfff-35b8ca593732-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.534049 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/4a73e4bf-8575-43f8-bfff-35b8ca593732-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.534136 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a73e4bf-8575-43f8-bfff-35b8ca593732-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.534175 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a73e4bf-8575-43f8-bfff-35b8ca593732-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.534419 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd8fs\" (UniqueName: \"kubernetes.io/projected/4a73e4bf-8575-43f8-bfff-35b8ca593732-kube-api-access-wd8fs\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.534498 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.534586 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/4a73e4bf-8575-43f8-bfff-35b8ca593732-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.635915 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4a73e4bf-8575-43f8-bfff-35b8ca593732-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.635976 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4a73e4bf-8575-43f8-bfff-35b8ca593732-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.636016 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a73e4bf-8575-43f8-bfff-35b8ca593732-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.636041 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a73e4bf-8575-43f8-bfff-35b8ca593732-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.636104 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd8fs\" (UniqueName: 
\"kubernetes.io/projected/4a73e4bf-8575-43f8-bfff-35b8ca593732-kube-api-access-wd8fs\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.636135 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.636171 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a73e4bf-8575-43f8-bfff-35b8ca593732-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.636205 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4a73e4bf-8575-43f8-bfff-35b8ca593732-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.636359 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.636535 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/4a73e4bf-8575-43f8-bfff-35b8ca593732-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.636973 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4a73e4bf-8575-43f8-bfff-35b8ca593732-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.637202 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4a73e4bf-8575-43f8-bfff-35b8ca593732-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.638126 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a73e4bf-8575-43f8-bfff-35b8ca593732-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.650694 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a73e4bf-8575-43f8-bfff-35b8ca593732-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.651142 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a73e4bf-8575-43f8-bfff-35b8ca593732-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.659523 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd8fs\" (UniqueName: \"kubernetes.io/projected/4a73e4bf-8575-43f8-bfff-35b8ca593732-kube-api-access-wd8fs\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.660493 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4a73e4bf-8575-43f8-bfff-35b8ca593732\") " pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.717476 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.758337 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.759782 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.762602 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-xzldl" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.762849 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.763171 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.774899 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.940907 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/85a966c3-05cc-49d0-ae99-0c774c67e89d-kolla-config\") pod \"memcached-0\" (UID: \"85a966c3-05cc-49d0-ae99-0c774c67e89d\") " pod="openstack/memcached-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.941490 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a966c3-05cc-49d0-ae99-0c774c67e89d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"85a966c3-05cc-49d0-ae99-0c774c67e89d\") " pod="openstack/memcached-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.941625 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85a966c3-05cc-49d0-ae99-0c774c67e89d-config-data\") pod \"memcached-0\" (UID: \"85a966c3-05cc-49d0-ae99-0c774c67e89d\") " pod="openstack/memcached-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.941757 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/85a966c3-05cc-49d0-ae99-0c774c67e89d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"85a966c3-05cc-49d0-ae99-0c774c67e89d\") " pod="openstack/memcached-0" Feb 21 22:01:05 crc kubenswrapper[4717]: I0221 22:01:05.941897 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjzqx\" (UniqueName: \"kubernetes.io/projected/85a966c3-05cc-49d0-ae99-0c774c67e89d-kube-api-access-cjzqx\") pod \"memcached-0\" (UID: \"85a966c3-05cc-49d0-ae99-0c774c67e89d\") " pod="openstack/memcached-0" Feb 21 22:01:06 crc kubenswrapper[4717]: I0221 22:01:06.043248 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/85a966c3-05cc-49d0-ae99-0c774c67e89d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"85a966c3-05cc-49d0-ae99-0c774c67e89d\") " pod="openstack/memcached-0" Feb 21 22:01:06 crc kubenswrapper[4717]: I0221 22:01:06.043359 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjzqx\" (UniqueName: \"kubernetes.io/projected/85a966c3-05cc-49d0-ae99-0c774c67e89d-kube-api-access-cjzqx\") pod \"memcached-0\" (UID: \"85a966c3-05cc-49d0-ae99-0c774c67e89d\") " pod="openstack/memcached-0" Feb 21 22:01:06 crc kubenswrapper[4717]: I0221 22:01:06.043392 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/85a966c3-05cc-49d0-ae99-0c774c67e89d-kolla-config\") pod \"memcached-0\" (UID: \"85a966c3-05cc-49d0-ae99-0c774c67e89d\") " pod="openstack/memcached-0" Feb 21 22:01:06 crc kubenswrapper[4717]: I0221 22:01:06.043415 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a966c3-05cc-49d0-ae99-0c774c67e89d-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"85a966c3-05cc-49d0-ae99-0c774c67e89d\") " pod="openstack/memcached-0" Feb 21 22:01:06 crc kubenswrapper[4717]: I0221 22:01:06.043548 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85a966c3-05cc-49d0-ae99-0c774c67e89d-config-data\") pod \"memcached-0\" (UID: \"85a966c3-05cc-49d0-ae99-0c774c67e89d\") " pod="openstack/memcached-0" Feb 21 22:01:06 crc kubenswrapper[4717]: I0221 22:01:06.044571 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85a966c3-05cc-49d0-ae99-0c774c67e89d-config-data\") pod \"memcached-0\" (UID: \"85a966c3-05cc-49d0-ae99-0c774c67e89d\") " pod="openstack/memcached-0" Feb 21 22:01:06 crc kubenswrapper[4717]: I0221 22:01:06.045403 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/85a966c3-05cc-49d0-ae99-0c774c67e89d-kolla-config\") pod \"memcached-0\" (UID: \"85a966c3-05cc-49d0-ae99-0c774c67e89d\") " pod="openstack/memcached-0" Feb 21 22:01:06 crc kubenswrapper[4717]: I0221 22:01:06.050357 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/85a966c3-05cc-49d0-ae99-0c774c67e89d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"85a966c3-05cc-49d0-ae99-0c774c67e89d\") " pod="openstack/memcached-0" Feb 21 22:01:06 crc kubenswrapper[4717]: I0221 22:01:06.062159 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjzqx\" (UniqueName: \"kubernetes.io/projected/85a966c3-05cc-49d0-ae99-0c774c67e89d-kube-api-access-cjzqx\") pod \"memcached-0\" (UID: \"85a966c3-05cc-49d0-ae99-0c774c67e89d\") " pod="openstack/memcached-0" Feb 21 22:01:06 crc kubenswrapper[4717]: I0221 22:01:06.066401 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/85a966c3-05cc-49d0-ae99-0c774c67e89d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"85a966c3-05cc-49d0-ae99-0c774c67e89d\") " pod="openstack/memcached-0" Feb 21 22:01:06 crc kubenswrapper[4717]: I0221 22:01:06.076254 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 21 22:01:07 crc kubenswrapper[4717]: I0221 22:01:07.884302 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 21 22:01:07 crc kubenswrapper[4717]: I0221 22:01:07.885529 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 21 22:01:07 crc kubenswrapper[4717]: I0221 22:01:07.888060 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-g2dk7" Feb 21 22:01:07 crc kubenswrapper[4717]: I0221 22:01:07.897048 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 21 22:01:08 crc kubenswrapper[4717]: I0221 22:01:08.078949 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbss2\" (UniqueName: \"kubernetes.io/projected/2e08bfd4-7ef2-4895-89e2-c9265d0adc13-kube-api-access-gbss2\") pod \"kube-state-metrics-0\" (UID: \"2e08bfd4-7ef2-4895-89e2-c9265d0adc13\") " pod="openstack/kube-state-metrics-0" Feb 21 22:01:08 crc kubenswrapper[4717]: I0221 22:01:08.180653 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbss2\" (UniqueName: \"kubernetes.io/projected/2e08bfd4-7ef2-4895-89e2-c9265d0adc13-kube-api-access-gbss2\") pod \"kube-state-metrics-0\" (UID: \"2e08bfd4-7ef2-4895-89e2-c9265d0adc13\") " pod="openstack/kube-state-metrics-0" Feb 21 22:01:08 crc kubenswrapper[4717]: I0221 22:01:08.211176 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbss2\" (UniqueName: 
\"kubernetes.io/projected/2e08bfd4-7ef2-4895-89e2-c9265d0adc13-kube-api-access-gbss2\") pod \"kube-state-metrics-0\" (UID: \"2e08bfd4-7ef2-4895-89e2-c9265d0adc13\") " pod="openstack/kube-state-metrics-0" Feb 21 22:01:08 crc kubenswrapper[4717]: I0221 22:01:08.507220 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 21 22:01:09 crc kubenswrapper[4717]: I0221 22:01:09.062808 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:01:09 crc kubenswrapper[4717]: I0221 22:01:09.063145 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.186104 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-828xd"] Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.188471 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.198165 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-sls6z"] Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.199470 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-828xd"] Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.199554 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.210355 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.210363 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-l599m" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.210787 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.217585 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-sls6z"] Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.350974 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51572320-b28e-45be-ba55-524f9e0ccc61-scripts\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.351021 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/51572320-b28e-45be-ba55-524f9e0ccc61-var-log\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.351046 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/51572320-b28e-45be-ba55-524f9e0ccc61-var-run\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.351223 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwq97\" (UniqueName: \"kubernetes.io/projected/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-kube-api-access-fwq97\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.352692 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-var-log-ovn\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.352737 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/51572320-b28e-45be-ba55-524f9e0ccc61-etc-ovs\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.352763 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-var-run-ovn\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.352789 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-ovn-controller-tls-certs\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.352824 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/51572320-b28e-45be-ba55-524f9e0ccc61-var-lib\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.352896 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-combined-ca-bundle\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.352957 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnpns\" (UniqueName: \"kubernetes.io/projected/51572320-b28e-45be-ba55-524f9e0ccc61-kube-api-access-jnpns\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.352980 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-var-run\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.353008 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-scripts\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.454238 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-fwq97\" (UniqueName: \"kubernetes.io/projected/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-kube-api-access-fwq97\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.454323 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-var-log-ovn\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.454349 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/51572320-b28e-45be-ba55-524f9e0ccc61-etc-ovs\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.454972 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-var-log-ovn\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.455014 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/51572320-b28e-45be-ba55-524f9e0ccc61-etc-ovs\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.455031 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-var-run-ovn\") pod \"ovn-controller-828xd\" (UID: 
\"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.455171 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-ovn-controller-tls-certs\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.455196 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-var-run-ovn\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.455212 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/51572320-b28e-45be-ba55-524f9e0ccc61-var-lib\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.455320 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-combined-ca-bundle\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.455414 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/51572320-b28e-45be-ba55-524f9e0ccc61-var-lib\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.455480 
4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnpns\" (UniqueName: \"kubernetes.io/projected/51572320-b28e-45be-ba55-524f9e0ccc61-kube-api-access-jnpns\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.455535 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-var-run\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.455578 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-scripts\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.455621 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51572320-b28e-45be-ba55-524f9e0ccc61-scripts\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.455665 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/51572320-b28e-45be-ba55-524f9e0ccc61-var-log\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.455690 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/51572320-b28e-45be-ba55-524f9e0ccc61-var-run\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.458139 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-scripts\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.458274 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-var-run\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.458418 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/51572320-b28e-45be-ba55-524f9e0ccc61-var-run\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.458653 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/51572320-b28e-45be-ba55-524f9e0ccc61-var-log\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.459526 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51572320-b28e-45be-ba55-524f9e0ccc61-scripts\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc 
kubenswrapper[4717]: I0221 22:01:11.463360 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-combined-ca-bundle\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.469884 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwq97\" (UniqueName: \"kubernetes.io/projected/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-kube-api-access-fwq97\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.474909 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1-ovn-controller-tls-certs\") pod \"ovn-controller-828xd\" (UID: \"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1\") " pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.481693 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnpns\" (UniqueName: \"kubernetes.io/projected/51572320-b28e-45be-ba55-524f9e0ccc61-kube-api-access-jnpns\") pod \"ovn-controller-ovs-sls6z\" (UID: \"51572320-b28e-45be-ba55-524f9e0ccc61\") " pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.574769 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.581201 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-828xd" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.596915 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.598895 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.606694 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.606802 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dv8v9" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.607062 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.607352 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.607817 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.624498 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.762375 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcfdad72-66d7-4087-b0c3-4cb1925565a1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.762434 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dcfdad72-66d7-4087-b0c3-4cb1925565a1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.762468 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcfdad72-66d7-4087-b0c3-4cb1925565a1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.762493 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcfdad72-66d7-4087-b0c3-4cb1925565a1-config\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.762541 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcfdad72-66d7-4087-b0c3-4cb1925565a1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.762558 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcfdad72-66d7-4087-b0c3-4cb1925565a1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.762575 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmtz4\" (UniqueName: \"kubernetes.io/projected/dcfdad72-66d7-4087-b0c3-4cb1925565a1-kube-api-access-lmtz4\") 
pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.762735 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.864034 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcfdad72-66d7-4087-b0c3-4cb1925565a1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.864130 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcfdad72-66d7-4087-b0c3-4cb1925565a1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.864176 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmtz4\" (UniqueName: \"kubernetes.io/projected/dcfdad72-66d7-4087-b0c3-4cb1925565a1-kube-api-access-lmtz4\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.864229 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0" Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.864316 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcfdad72-66d7-4087-b0c3-4cb1925565a1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.864376 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcfdad72-66d7-4087-b0c3-4cb1925565a1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.864429 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcfdad72-66d7-4087-b0c3-4cb1925565a1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.864465 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcfdad72-66d7-4087-b0c3-4cb1925565a1-config\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.865560 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.865954 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcfdad72-66d7-4087-b0c3-4cb1925565a1-config\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.868632 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcfdad72-66d7-4087-b0c3-4cb1925565a1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.869328 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcfdad72-66d7-4087-b0c3-4cb1925565a1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.872698 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcfdad72-66d7-4087-b0c3-4cb1925565a1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.874422 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcfdad72-66d7-4087-b0c3-4cb1925565a1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.880506 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcfdad72-66d7-4087-b0c3-4cb1925565a1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.899506 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.899761 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmtz4\" (UniqueName: \"kubernetes.io/projected/dcfdad72-66d7-4087-b0c3-4cb1925565a1-kube-api-access-lmtz4\") pod \"ovsdbserver-nb-0\" (UID: \"dcfdad72-66d7-4087-b0c3-4cb1925565a1\") " pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:11 crc kubenswrapper[4717]: I0221 22:01:11.929266 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:14 crc kubenswrapper[4717]: E0221 22:01:14.936505 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Feb 21 22:01:14 crc kubenswrapper[4717]: E0221 22:01:14.937182 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sxn2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-4sj2r_openstack(03aa84f4-10f8-47b0-8d12-c9340afac6fd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 21 22:01:14 crc kubenswrapper[4717]: E0221 22:01:14.938447 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r" podUID="03aa84f4-10f8-47b0-8d12-c9340afac6fd"
Feb 21 22:01:14 crc kubenswrapper[4717]: E0221 22:01:14.985422 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Feb 21 22:01:14 crc kubenswrapper[4717]: E0221 22:01:14.985650 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s679g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-4m857_openstack(71a97217-cc9e-41e2-8406-6e288988b05d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 21 22:01:14 crc kubenswrapper[4717]: E0221 22:01:14.986929 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-4m857" podUID="71a97217-cc9e-41e2-8406-6e288988b05d"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.353087 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.534148 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.614290 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.615879 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.618371 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.618403 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4m857"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.618653 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.618886 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-zc894"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.619357 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.627609 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxn2d\" (UniqueName: \"kubernetes.io/projected/03aa84f4-10f8-47b0-8d12-c9340afac6fd-kube-api-access-sxn2d\") pod \"03aa84f4-10f8-47b0-8d12-c9340afac6fd\" (UID: \"03aa84f4-10f8-47b0-8d12-c9340afac6fd\") "
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.627914 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03aa84f4-10f8-47b0-8d12-c9340afac6fd-dns-svc\") pod \"03aa84f4-10f8-47b0-8d12-c9340afac6fd\" (UID: \"03aa84f4-10f8-47b0-8d12-c9340afac6fd\") "
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.627983 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03aa84f4-10f8-47b0-8d12-c9340afac6fd-config\") pod \"03aa84f4-10f8-47b0-8d12-c9340afac6fd\" (UID: \"03aa84f4-10f8-47b0-8d12-c9340afac6fd\") "
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.628427 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03aa84f4-10f8-47b0-8d12-c9340afac6fd-config" (OuterVolumeSpecName: "config") pod "03aa84f4-10f8-47b0-8d12-c9340afac6fd" (UID: "03aa84f4-10f8-47b0-8d12-c9340afac6fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.628642 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03aa84f4-10f8-47b0-8d12-c9340afac6fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "03aa84f4-10f8-47b0-8d12-c9340afac6fd" (UID: "03aa84f4-10f8-47b0-8d12-c9340afac6fd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.638686 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.642209 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03aa84f4-10f8-47b0-8d12-c9340afac6fd-kube-api-access-sxn2d" (OuterVolumeSpecName: "kube-api-access-sxn2d") pod "03aa84f4-10f8-47b0-8d12-c9340afac6fd" (UID: "03aa84f4-10f8-47b0-8d12-c9340afac6fd"). InnerVolumeSpecName "kube-api-access-sxn2d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:01:15 crc kubenswrapper[4717]: W0221 22:01:15.670764 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode49285f0_879f_40db_8eb9_2e8e18a87bb7.slice/crio-8f1831bf96bf171e939374c389d467d38ed9a9d0700279ca9975c299049c7b77 WatchSource:0}: Error finding container 8f1831bf96bf171e939374c389d467d38ed9a9d0700279ca9975c299049c7b77: Status 404 returned error can't find the container with id 8f1831bf96bf171e939374c389d467d38ed9a9d0700279ca9975c299049c7b77
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.685221 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.690844 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.729778 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a97217-cc9e-41e2-8406-6e288988b05d-config\") pod \"71a97217-cc9e-41e2-8406-6e288988b05d\" (UID: \"71a97217-cc9e-41e2-8406-6e288988b05d\") "
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.729877 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s679g\" (UniqueName: \"kubernetes.io/projected/71a97217-cc9e-41e2-8406-6e288988b05d-kube-api-access-s679g\") pod \"71a97217-cc9e-41e2-8406-6e288988b05d\" (UID: \"71a97217-cc9e-41e2-8406-6e288988b05d\") "
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.730127 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgqc9\" (UniqueName: \"kubernetes.io/projected/75f38ef9-3fc9-428a-8364-96c3938d69e5-kube-api-access-mgqc9\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.730155 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75f38ef9-3fc9-428a-8364-96c3938d69e5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.730220 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f38ef9-3fc9-428a-8364-96c3938d69e5-config\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.730457 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/75f38ef9-3fc9-428a-8364-96c3938d69e5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.730518 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.730552 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/75f38ef9-3fc9-428a-8364-96c3938d69e5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.730728 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/75f38ef9-3fc9-428a-8364-96c3938d69e5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.730761 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75f38ef9-3fc9-428a-8364-96c3938d69e5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.730820 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03aa84f4-10f8-47b0-8d12-c9340afac6fd-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.730814 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a97217-cc9e-41e2-8406-6e288988b05d-config" (OuterVolumeSpecName: "config") pod "71a97217-cc9e-41e2-8406-6e288988b05d" (UID: "71a97217-cc9e-41e2-8406-6e288988b05d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.730831 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03aa84f4-10f8-47b0-8d12-c9340afac6fd-config\") on node \"crc\" DevicePath \"\""
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.730896 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxn2d\" (UniqueName: \"kubernetes.io/projected/03aa84f4-10f8-47b0-8d12-c9340afac6fd-kube-api-access-sxn2d\") on node \"crc\" DevicePath \"\""
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.733629 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a97217-cc9e-41e2-8406-6e288988b05d-kube-api-access-s679g" (OuterVolumeSpecName: "kube-api-access-s679g") pod "71a97217-cc9e-41e2-8406-6e288988b05d" (UID: "71a97217-cc9e-41e2-8406-6e288988b05d"). InnerVolumeSpecName "kube-api-access-s679g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.787258 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.795805 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 21 22:01:15 crc kubenswrapper[4717]: W0221 22:01:15.799429 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc030d9bf_a8c2_4dc0_996b_82ed1214d4bd.slice/crio-6a59bb117181333263825fa147402842572afce74fbe6a63942845cd545015c8 WatchSource:0}: Error finding container 6a59bb117181333263825fa147402842572afce74fbe6a63942845cd545015c8: Status 404 returned error can't find the container with id 6a59bb117181333263825fa147402842572afce74fbe6a63942845cd545015c8
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.815462 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.831843 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/75f38ef9-3fc9-428a-8364-96c3938d69e5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.831905 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75f38ef9-3fc9-428a-8364-96c3938d69e5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.831927 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgqc9\" (UniqueName: \"kubernetes.io/projected/75f38ef9-3fc9-428a-8364-96c3938d69e5-kube-api-access-mgqc9\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.831946 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75f38ef9-3fc9-428a-8364-96c3938d69e5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.832006 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f38ef9-3fc9-428a-8364-96c3938d69e5-config\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.832053 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/75f38ef9-3fc9-428a-8364-96c3938d69e5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.832075 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.832095 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/75f38ef9-3fc9-428a-8364-96c3938d69e5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.832147 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a97217-cc9e-41e2-8406-6e288988b05d-config\") on node \"crc\" DevicePath \"\""
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.832160 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s679g\" (UniqueName: \"kubernetes.io/projected/71a97217-cc9e-41e2-8406-6e288988b05d-kube-api-access-s679g\") on node \"crc\" DevicePath \"\""
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.832297 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/75f38ef9-3fc9-428a-8364-96c3938d69e5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.832899 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.833817 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f38ef9-3fc9-428a-8364-96c3938d69e5-config\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.834433 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75f38ef9-3fc9-428a-8364-96c3938d69e5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.837151 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75f38ef9-3fc9-428a-8364-96c3938d69e5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.847259 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/75f38ef9-3fc9-428a-8364-96c3938d69e5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.847599 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/75f38ef9-3fc9-428a-8364-96c3938d69e5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.852158 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgqc9\" (UniqueName: \"kubernetes.io/projected/75f38ef9-3fc9-428a-8364-96c3938d69e5-kube-api-access-mgqc9\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.864531 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"75f38ef9-3fc9-428a-8364-96c3938d69e5\") " pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.901688 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 21 22:01:15 crc kubenswrapper[4717]: W0221 22:01:15.905142 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcfdad72_66d7_4087_b0c3_4cb1925565a1.slice/crio-911c752ac407978f907245852f3682da18ee1ef3b353eb68e29a581dce2a8975 WatchSource:0}: Error finding container 911c752ac407978f907245852f3682da18ee1ef3b353eb68e29a581dce2a8975: Status 404 returned error can't find the container with id 911c752ac407978f907245852f3682da18ee1ef3b353eb68e29a581dce2a8975
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.943731 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:15 crc kubenswrapper[4717]: I0221 22:01:15.967609 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-828xd"]
Feb 21 22:01:15 crc kubenswrapper[4717]: W0221 22:01:15.976047 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ca26e0c_e312_4c2c_9e4f_89d8b15b81a1.slice/crio-5a00abaf9778de57e15a1bfeba8e89283c8767348f2c2e5b3cdc4d9043f38f40 WatchSource:0}: Error finding container 5a00abaf9778de57e15a1bfeba8e89283c8767348f2c2e5b3cdc4d9043f38f40: Status 404 returned error can't find the container with id 5a00abaf9778de57e15a1bfeba8e89283c8767348f2c2e5b3cdc4d9043f38f40
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.016638 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-sls6z"]
Feb 21 22:01:16 crc kubenswrapper[4717]: W0221 22:01:16.028546 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51572320_b28e_45be_ba55_524f9e0ccc61.slice/crio-c5d99ec5e7b3fb5bfd52e7fbd9221f54aaf965e147bcdcd4fd31f578f60b7a05 WatchSource:0}: Error finding container c5d99ec5e7b3fb5bfd52e7fbd9221f54aaf965e147bcdcd4fd31f578f60b7a05: Status 404 returned error can't find the container with id c5d99ec5e7b3fb5bfd52e7fbd9221f54aaf965e147bcdcd4fd31f578f60b7a05
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.123785 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-4m857" event={"ID":"71a97217-cc9e-41e2-8406-6e288988b05d","Type":"ContainerDied","Data":"6181c85ccf0c70a5d2fcbaa99db75dfb47cf2667d5f6c373d5edf3906348bcf6"}
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.123906 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4m857"
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.128335 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4a73e4bf-8575-43f8-bfff-35b8ca593732","Type":"ContainerStarted","Data":"13e60035d5ea49d86bd107e0899e0efc292a9bf64f66545e22b15da2b9f5604d"}
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.129144 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sls6z" event={"ID":"51572320-b28e-45be-ba55-524f9e0ccc61","Type":"ContainerStarted","Data":"c5d99ec5e7b3fb5bfd52e7fbd9221f54aaf965e147bcdcd4fd31f578f60b7a05"}
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.131446 4717 generic.go:334] "Generic (PLEG): container finished" podID="f38ebc64-08ee-432f-a8af-85e84a8608ee" containerID="08e8fa3e47f6c2978b50871c77445fd2e6f73c19950f83b98597d9278b5149cf" exitCode=0
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.131644 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" event={"ID":"f38ebc64-08ee-432f-a8af-85e84a8608ee","Type":"ContainerDied","Data":"08e8fa3e47f6c2978b50871c77445fd2e6f73c19950f83b98597d9278b5149cf"}
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.133319 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2e08bfd4-7ef2-4895-89e2-c9265d0adc13","Type":"ContainerStarted","Data":"e8757ffb32c00b8d106e292ccb847bf0bb61e5ec5d11f91816df850941bdd62a"}
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.142016 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e49285f0-879f-40db-8eb9-2e8e18a87bb7","Type":"ContainerStarted","Data":"8f1831bf96bf171e939374c389d467d38ed9a9d0700279ca9975c299049c7b77"}
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.143677 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dcfdad72-66d7-4087-b0c3-4cb1925565a1","Type":"ContainerStarted","Data":"911c752ac407978f907245852f3682da18ee1ef3b353eb68e29a581dce2a8975"}
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.144845 4717 generic.go:334] "Generic (PLEG): container finished" podID="9c3da2c9-9230-4114-b6fd-95f79b5d7c3c" containerID="3331edf372535cf61883a61d01b42b9a89ec9a06902ca968c82f202561dd7b22" exitCode=0
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.144905 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pt5j9" event={"ID":"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c","Type":"ContainerDied","Data":"3331edf372535cf61883a61d01b42b9a89ec9a06902ca968c82f202561dd7b22"}
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.162392 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r" event={"ID":"03aa84f4-10f8-47b0-8d12-c9340afac6fd","Type":"ContainerDied","Data":"54d24dac55773967955ff63b5ee88e8f621d66c2f51fed8b22dbb5b9ce222c5e"}
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.162567 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4sj2r"
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.174824 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-828xd" event={"ID":"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1","Type":"ContainerStarted","Data":"5a00abaf9778de57e15a1bfeba8e89283c8767348f2c2e5b3cdc4d9043f38f40"}
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.175924 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd","Type":"ContainerStarted","Data":"6a59bb117181333263825fa147402842572afce74fbe6a63942845cd545015c8"}
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.180757 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2400f71f-f7db-4ed8-83aa-8427afd4dcd5","Type":"ContainerStarted","Data":"7a499f94393ed90369dc531875aa72cbf0408202b6091dde870184a9ffdb964b"}
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.220663 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"85a966c3-05cc-49d0-ae99-0c774c67e89d","Type":"ContainerStarted","Data":"675edcb425eb1a7d57f58fd7493f71d71a7d3fe233f96524e8bd7d31a01ab983"}
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.233206 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4m857"]
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.265716 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4m857"]
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.323201 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4sj2r"]
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.328628 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4sj2r"]
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.397766 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-kllk6"]
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.403144 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-kllk6"
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.409754 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.418680 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kllk6"]
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.543565 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2lsdg"]
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.550362 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-config\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6"
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.550399 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlqml\" (UniqueName: \"kubernetes.io/projected/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-kube-api-access-tlqml\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6"
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.550425 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-ovn-rundir\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6"
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.550489 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-ovs-rundir\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6"
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.550520 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-combined-ca-bundle\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6"
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.550587 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6"
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.578675 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ktd86"]
Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.579878 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.582438 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.595915 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ktd86"] Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.618303 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.652440 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-ovs-rundir\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.652512 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-combined-ca-bundle\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.652586 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.652676 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-config\") pod 
\"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.652707 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlqml\" (UniqueName: \"kubernetes.io/projected/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-kube-api-access-tlqml\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.652734 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-ovn-rundir\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.653103 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-ovn-rundir\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.654061 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-config\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.654239 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-ovs-rundir\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " 
pod="openstack/ovn-controller-metrics-kllk6" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.659601 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-combined-ca-bundle\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.661984 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.667877 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlqml\" (UniqueName: \"kubernetes.io/projected/d770b9d0-2378-4b10-bf5a-b7e91c6b3843-kube-api-access-tlqml\") pod \"ovn-controller-metrics-kllk6\" (UID: \"d770b9d0-2378-4b10-bf5a-b7e91c6b3843\") " pod="openstack/ovn-controller-metrics-kllk6" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.732668 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-kllk6" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.754524 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hh6q\" (UniqueName: \"kubernetes.io/projected/8b22bbd0-3a57-4403-98f2-8781f50f68da-kube-api-access-8hh6q\") pod \"dnsmasq-dns-7fd796d7df-ktd86\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") " pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.754909 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-config\") pod \"dnsmasq-dns-7fd796d7df-ktd86\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") " pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.755224 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-ktd86\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") " pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.755372 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-ktd86\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") " pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.825799 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pt5j9"] Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.852583 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-86db49b7ff-s78gk"] Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.853718 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.856682 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.857895 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hh6q\" (UniqueName: \"kubernetes.io/projected/8b22bbd0-3a57-4403-98f2-8781f50f68da-kube-api-access-8hh6q\") pod \"dnsmasq-dns-7fd796d7df-ktd86\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") " pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.857947 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-config\") pod \"dnsmasq-dns-7fd796d7df-ktd86\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") " pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.857978 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-ktd86\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") " pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.858046 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-ktd86\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") " pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.858793 
4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-ktd86\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") " pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.859529 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-config\") pod \"dnsmasq-dns-7fd796d7df-ktd86\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") " pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.860014 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-ktd86\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") " pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.884316 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hh6q\" (UniqueName: \"kubernetes.io/projected/8b22bbd0-3a57-4403-98f2-8781f50f68da-kube-api-access-8hh6q\") pod \"dnsmasq-dns-7fd796d7df-ktd86\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") " pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.885409 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s78gk"] Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.911850 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.959758 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-s78gk\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.959814 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-config\") pod \"dnsmasq-dns-86db49b7ff-s78gk\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.959832 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-s78gk\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.959927 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-s78gk\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:16 crc kubenswrapper[4717]: I0221 22:01:16.960142 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfhm9\" (UniqueName: \"kubernetes.io/projected/192e0436-8d26-476c-b6fc-d2d43f26dc1a-kube-api-access-nfhm9\") pod \"dnsmasq-dns-86db49b7ff-s78gk\" 
(UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.062113 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-s78gk\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.062192 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfhm9\" (UniqueName: \"kubernetes.io/projected/192e0436-8d26-476c-b6fc-d2d43f26dc1a-kube-api-access-nfhm9\") pod \"dnsmasq-dns-86db49b7ff-s78gk\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.062264 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-s78gk\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.062293 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-config\") pod \"dnsmasq-dns-86db49b7ff-s78gk\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.062312 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-s78gk\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.063932 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-s78gk\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.064649 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-s78gk\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.064721 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-config\") pod \"dnsmasq-dns-86db49b7ff-s78gk\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.066229 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-s78gk\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.101894 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfhm9\" (UniqueName: \"kubernetes.io/projected/192e0436-8d26-476c-b6fc-d2d43f26dc1a-kube-api-access-nfhm9\") pod \"dnsmasq-dns-86db49b7ff-s78gk\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.186340 
4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.297195 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pt5j9" event={"ID":"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c","Type":"ContainerStarted","Data":"801c9be46353848d2e3d669e9e3e6878f8fb09527ea2de949863b96dba000159"} Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.297542 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-pt5j9" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.305718 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kllk6"] Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.306481 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" event={"ID":"f38ebc64-08ee-432f-a8af-85e84a8608ee","Type":"ContainerStarted","Data":"92ea52502c55b3418e3daf2e7a1fc08315917eae071db2a40853673409bb24bb"} Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.307172 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.314539 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"75f38ef9-3fc9-428a-8364-96c3938d69e5","Type":"ContainerStarted","Data":"0a69a946e2d63197902445e9d847cfb02cebd96aeb8abe85ebd245e3581117df"} Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.315081 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-pt5j9" podStartSLOduration=3.345889863 podStartE2EDuration="16.315071144s" podCreationTimestamp="2026-02-21 22:01:01 +0000 UTC" firstStartedPulling="2026-02-21 22:01:02.127505341 +0000 UTC m=+876.909038963" lastFinishedPulling="2026-02-21 
22:01:15.096686622 +0000 UTC m=+889.878220244" observedRunningTime="2026-02-21 22:01:17.314848889 +0000 UTC m=+892.096382511" watchObservedRunningTime="2026-02-21 22:01:17.315071144 +0000 UTC m=+892.096604756" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.346982 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" podStartSLOduration=3.5631850099999998 podStartE2EDuration="16.346941245s" podCreationTimestamp="2026-02-21 22:01:01 +0000 UTC" firstStartedPulling="2026-02-21 22:01:02.319771321 +0000 UTC m=+877.101304933" lastFinishedPulling="2026-02-21 22:01:15.103527546 +0000 UTC m=+889.885061168" observedRunningTime="2026-02-21 22:01:17.330648306 +0000 UTC m=+892.112181928" watchObservedRunningTime="2026-02-21 22:01:17.346941245 +0000 UTC m=+892.128474877" Feb 21 22:01:17 crc kubenswrapper[4717]: W0221 22:01:17.424662 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd770b9d0_2378_4b10_bf5a_b7e91c6b3843.slice/crio-4ba58c8ab16a9585a3cab4eb2db025893518c5a876d5e5a517b51694c97aa2e7 WatchSource:0}: Error finding container 4ba58c8ab16a9585a3cab4eb2db025893518c5a876d5e5a517b51694c97aa2e7: Status 404 returned error can't find the container with id 4ba58c8ab16a9585a3cab4eb2db025893518c5a876d5e5a517b51694c97aa2e7 Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.841022 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ktd86"] Feb 21 22:01:17 crc kubenswrapper[4717]: W0221 22:01:17.850558 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b22bbd0_3a57_4403_98f2_8781f50f68da.slice/crio-8bfbfccbb9c2accd711188f07f858df2d0cc5319f5152cdedc674987fbfc52e2 WatchSource:0}: Error finding container 8bfbfccbb9c2accd711188f07f858df2d0cc5319f5152cdedc674987fbfc52e2: Status 404 returned error can't find 
the container with id 8bfbfccbb9c2accd711188f07f858df2d0cc5319f5152cdedc674987fbfc52e2 Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.991425 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03aa84f4-10f8-47b0-8d12-c9340afac6fd" path="/var/lib/kubelet/pods/03aa84f4-10f8-47b0-8d12-c9340afac6fd/volumes" Feb 21 22:01:17 crc kubenswrapper[4717]: I0221 22:01:17.992167 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a97217-cc9e-41e2-8406-6e288988b05d" path="/var/lib/kubelet/pods/71a97217-cc9e-41e2-8406-6e288988b05d/volumes" Feb 21 22:01:18 crc kubenswrapper[4717]: I0221 22:01:18.324309 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" event={"ID":"8b22bbd0-3a57-4403-98f2-8781f50f68da","Type":"ContainerStarted","Data":"8bfbfccbb9c2accd711188f07f858df2d0cc5319f5152cdedc674987fbfc52e2"} Feb 21 22:01:18 crc kubenswrapper[4717]: I0221 22:01:18.325471 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kllk6" event={"ID":"d770b9d0-2378-4b10-bf5a-b7e91c6b3843","Type":"ContainerStarted","Data":"4ba58c8ab16a9585a3cab4eb2db025893518c5a876d5e5a517b51694c97aa2e7"} Feb 21 22:01:18 crc kubenswrapper[4717]: I0221 22:01:18.325648 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-pt5j9" podUID="9c3da2c9-9230-4114-b6fd-95f79b5d7c3c" containerName="dnsmasq-dns" containerID="cri-o://801c9be46353848d2e3d669e9e3e6878f8fb09527ea2de949863b96dba000159" gracePeriod=10 Feb 21 22:01:18 crc kubenswrapper[4717]: I0221 22:01:18.325799 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" podUID="f38ebc64-08ee-432f-a8af-85e84a8608ee" containerName="dnsmasq-dns" containerID="cri-o://92ea52502c55b3418e3daf2e7a1fc08315917eae071db2a40853673409bb24bb" gracePeriod=10 Feb 21 22:01:19 crc kubenswrapper[4717]: I0221 22:01:19.335514 
4717 generic.go:334] "Generic (PLEG): container finished" podID="f38ebc64-08ee-432f-a8af-85e84a8608ee" containerID="92ea52502c55b3418e3daf2e7a1fc08315917eae071db2a40853673409bb24bb" exitCode=0 Feb 21 22:01:19 crc kubenswrapper[4717]: I0221 22:01:19.335835 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" event={"ID":"f38ebc64-08ee-432f-a8af-85e84a8608ee","Type":"ContainerDied","Data":"92ea52502c55b3418e3daf2e7a1fc08315917eae071db2a40853673409bb24bb"} Feb 21 22:01:19 crc kubenswrapper[4717]: I0221 22:01:19.351298 4717 generic.go:334] "Generic (PLEG): container finished" podID="9c3da2c9-9230-4114-b6fd-95f79b5d7c3c" containerID="801c9be46353848d2e3d669e9e3e6878f8fb09527ea2de949863b96dba000159" exitCode=0 Feb 21 22:01:19 crc kubenswrapper[4717]: I0221 22:01:19.351369 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pt5j9" event={"ID":"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c","Type":"ContainerDied","Data":"801c9be46353848d2e3d669e9e3e6878f8fb09527ea2de949863b96dba000159"} Feb 21 22:01:22 crc kubenswrapper[4717]: I0221 22:01:22.041261 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s78gk"] Feb 21 22:01:23 crc kubenswrapper[4717]: W0221 22:01:23.585361 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod192e0436_8d26_476c_b6fc_d2d43f26dc1a.slice/crio-cce40c0b6d820fb759bd591d21498f4d33f0125add8b740f25555a0d58618d7f WatchSource:0}: Error finding container cce40c0b6d820fb759bd591d21498f4d33f0125add8b740f25555a0d58618d7f: Status 404 returned error can't find the container with id cce40c0b6d820fb759bd591d21498f4d33f0125add8b740f25555a0d58618d7f Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.693037 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pt5j9"
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.694264 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg7z7\" (UniqueName: \"kubernetes.io/projected/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-kube-api-access-gg7z7\") pod \"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c\" (UID: \"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c\") "
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.694461 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-config\") pod \"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c\" (UID: \"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c\") "
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.694511 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-dns-svc\") pod \"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c\" (UID: \"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c\") "
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.698436 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg"
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.701805 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-kube-api-access-gg7z7" (OuterVolumeSpecName: "kube-api-access-gg7z7") pod "9c3da2c9-9230-4114-b6fd-95f79b5d7c3c" (UID: "9c3da2c9-9230-4114-b6fd-95f79b5d7c3c"). InnerVolumeSpecName "kube-api-access-gg7z7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.770683 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-config" (OuterVolumeSpecName: "config") pod "9c3da2c9-9230-4114-b6fd-95f79b5d7c3c" (UID: "9c3da2c9-9230-4114-b6fd-95f79b5d7c3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.776350 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c3da2c9-9230-4114-b6fd-95f79b5d7c3c" (UID: "9c3da2c9-9230-4114-b6fd-95f79b5d7c3c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.795948 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg7z7\" (UniqueName: \"kubernetes.io/projected/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-kube-api-access-gg7z7\") on node \"crc\" DevicePath \"\""
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.795973 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-config\") on node \"crc\" DevicePath \"\""
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.795982 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.896436 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f38ebc64-08ee-432f-a8af-85e84a8608ee-dns-svc\") pod \"f38ebc64-08ee-432f-a8af-85e84a8608ee\" (UID: \"f38ebc64-08ee-432f-a8af-85e84a8608ee\") "
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.896520 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f38ebc64-08ee-432f-a8af-85e84a8608ee-config\") pod \"f38ebc64-08ee-432f-a8af-85e84a8608ee\" (UID: \"f38ebc64-08ee-432f-a8af-85e84a8608ee\") "
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.896551 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6rvm\" (UniqueName: \"kubernetes.io/projected/f38ebc64-08ee-432f-a8af-85e84a8608ee-kube-api-access-j6rvm\") pod \"f38ebc64-08ee-432f-a8af-85e84a8608ee\" (UID: \"f38ebc64-08ee-432f-a8af-85e84a8608ee\") "
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.900184 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f38ebc64-08ee-432f-a8af-85e84a8608ee-kube-api-access-j6rvm" (OuterVolumeSpecName: "kube-api-access-j6rvm") pod "f38ebc64-08ee-432f-a8af-85e84a8608ee" (UID: "f38ebc64-08ee-432f-a8af-85e84a8608ee"). InnerVolumeSpecName "kube-api-access-j6rvm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.927297 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f38ebc64-08ee-432f-a8af-85e84a8608ee-config" (OuterVolumeSpecName: "config") pod "f38ebc64-08ee-432f-a8af-85e84a8608ee" (UID: "f38ebc64-08ee-432f-a8af-85e84a8608ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.935611 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f38ebc64-08ee-432f-a8af-85e84a8608ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f38ebc64-08ee-432f-a8af-85e84a8608ee" (UID: "f38ebc64-08ee-432f-a8af-85e84a8608ee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.998298 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f38ebc64-08ee-432f-a8af-85e84a8608ee-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.998347 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f38ebc64-08ee-432f-a8af-85e84a8608ee-config\") on node \"crc\" DevicePath \"\""
Feb 21 22:01:23 crc kubenswrapper[4717]: I0221 22:01:23.998366 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6rvm\" (UniqueName: \"kubernetes.io/projected/f38ebc64-08ee-432f-a8af-85e84a8608ee-kube-api-access-j6rvm\") on node \"crc\" DevicePath \"\""
Feb 21 22:01:24 crc kubenswrapper[4717]: I0221 22:01:24.388686 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pt5j9" event={"ID":"9c3da2c9-9230-4114-b6fd-95f79b5d7c3c","Type":"ContainerDied","Data":"8ada62fff427e7a51e15f9aa598fce99398339a8f200f0a0b723248c61a456ff"}
Feb 21 22:01:24 crc kubenswrapper[4717]: I0221 22:01:24.388736 4717 scope.go:117] "RemoveContainer" containerID="801c9be46353848d2e3d669e9e3e6878f8fb09527ea2de949863b96dba000159"
Feb 21 22:01:24 crc kubenswrapper[4717]: I0221 22:01:24.388733 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pt5j9"
Feb 21 22:01:24 crc kubenswrapper[4717]: I0221 22:01:24.393276 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" event={"ID":"f38ebc64-08ee-432f-a8af-85e84a8608ee","Type":"ContainerDied","Data":"ccb501e30feddf9dc6c9abda1d11e86678172b6462d5525551d9705a750f0c37"}
Feb 21 22:01:24 crc kubenswrapper[4717]: I0221 22:01:24.393317 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg"
Feb 21 22:01:24 crc kubenswrapper[4717]: I0221 22:01:24.394912 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" event={"ID":"192e0436-8d26-476c-b6fc-d2d43f26dc1a","Type":"ContainerStarted","Data":"cce40c0b6d820fb759bd591d21498f4d33f0125add8b740f25555a0d58618d7f"}
Feb 21 22:01:24 crc kubenswrapper[4717]: I0221 22:01:24.417091 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pt5j9"]
Feb 21 22:01:24 crc kubenswrapper[4717]: I0221 22:01:24.439506 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pt5j9"]
Feb 21 22:01:24 crc kubenswrapper[4717]: I0221 22:01:24.444832 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2lsdg"]
Feb 21 22:01:24 crc kubenswrapper[4717]: I0221 22:01:24.448930 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2lsdg"]
Feb 21 22:01:25 crc kubenswrapper[4717]: I0221 22:01:25.375696 4717 scope.go:117] "RemoveContainer" containerID="3331edf372535cf61883a61d01b42b9a89ec9a06902ca968c82f202561dd7b22"
Feb 21 22:01:25 crc kubenswrapper[4717]: I0221 22:01:25.699337 4717 scope.go:117] "RemoveContainer" containerID="92ea52502c55b3418e3daf2e7a1fc08315917eae071db2a40853673409bb24bb"
Feb 21 22:01:25 crc kubenswrapper[4717]: I0221 22:01:25.776779 4717 scope.go:117] "RemoveContainer" containerID="08e8fa3e47f6c2978b50871c77445fd2e6f73c19950f83b98597d9278b5149cf"
Feb 21 22:01:26 crc kubenswrapper[4717]: I0221 22:01:25.999159 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c3da2c9-9230-4114-b6fd-95f79b5d7c3c" path="/var/lib/kubelet/pods/9c3da2c9-9230-4114-b6fd-95f79b5d7c3c/volumes"
Feb 21 22:01:26 crc kubenswrapper[4717]: I0221 22:01:26.001037 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f38ebc64-08ee-432f-a8af-85e84a8608ee" path="/var/lib/kubelet/pods/f38ebc64-08ee-432f-a8af-85e84a8608ee/volumes"
Feb 21 22:01:26 crc kubenswrapper[4717]: I0221 22:01:26.415324 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"85a966c3-05cc-49d0-ae99-0c774c67e89d","Type":"ContainerStarted","Data":"cac68526666a63b58e11acf56de601c8ee4a645f29871af49a5ca36ea50b60de"}
Feb 21 22:01:26 crc kubenswrapper[4717]: I0221 22:01:26.416425 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Feb 21 22:01:26 crc kubenswrapper[4717]: I0221 22:01:26.422839 4717 generic.go:334] "Generic (PLEG): container finished" podID="8b22bbd0-3a57-4403-98f2-8781f50f68da" containerID="821690f2ead9f6949b5443f1437473d6780c33184697b10d3b441059c3cbf2f3" exitCode=0
Feb 21 22:01:26 crc kubenswrapper[4717]: I0221 22:01:26.422981 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" event={"ID":"8b22bbd0-3a57-4403-98f2-8781f50f68da","Type":"ContainerDied","Data":"821690f2ead9f6949b5443f1437473d6780c33184697b10d3b441059c3cbf2f3"}
Feb 21 22:01:26 crc kubenswrapper[4717]: I0221 22:01:26.427213 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e49285f0-879f-40db-8eb9-2e8e18a87bb7","Type":"ContainerStarted","Data":"fa3b40a4b90039a82ff3830277b2df2aa64b9457852fdc4a4d6ad7fd8b2e9cd3"}
Feb 21 22:01:26 crc kubenswrapper[4717]: I0221 22:01:26.451542 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.546527109 podStartE2EDuration="21.451518287s" podCreationTimestamp="2026-02-21 22:01:05 +0000 UTC" firstStartedPulling="2026-02-21 22:01:15.680693837 +0000 UTC m=+890.462227459" lastFinishedPulling="2026-02-21 22:01:23.585685015 +0000 UTC m=+898.367218637" observedRunningTime="2026-02-21 22:01:26.444069519 +0000 UTC m=+901.225603181" watchObservedRunningTime="2026-02-21 22:01:26.451518287 +0000 UTC m=+901.233051959"
Feb 21 22:01:26 crc kubenswrapper[4717]: I0221 22:01:26.618211 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-pt5j9" podUID="9c3da2c9-9230-4114-b6fd-95f79b5d7c3c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.93:5353: i/o timeout"
Feb 21 22:01:26 crc kubenswrapper[4717]: I0221 22:01:26.888628 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-2lsdg" podUID="f38ebc64-08ee-432f-a8af-85e84a8608ee" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.94:5353: i/o timeout"
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.445665 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dcfdad72-66d7-4087-b0c3-4cb1925565a1","Type":"ContainerStarted","Data":"7e3415c77a6de1fc0759a80b3cc9641d214f02f492879dbf33c3750bf43f19e7"}
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.445972 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dcfdad72-66d7-4087-b0c3-4cb1925565a1","Type":"ContainerStarted","Data":"fb33266df7b1bb02fe7b3e11cdfc12e6507a5c85775d9e36eeac5512d7c9a257"}
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.448532 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4a73e4bf-8575-43f8-bfff-35b8ca593732","Type":"ContainerStarted","Data":"df6b5625c8f2d63c0cc9451de169f2392535e259f79ea3f3e9a5481ac21af337"}
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.451575 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-828xd" event={"ID":"4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1","Type":"ContainerStarted","Data":"248545a82baec8d99800c6cdcaad9933cc3010ac3474e252a1e02823e518930b"}
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.451740 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-828xd"
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.454026 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd","Type":"ContainerStarted","Data":"077026fc3358addb5baa8f087ccbe44cd9c25bb57ac6177774c2be4302445486"}
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.456499 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kllk6" event={"ID":"d770b9d0-2378-4b10-bf5a-b7e91c6b3843","Type":"ContainerStarted","Data":"4abbc1bd8dd95ecf4aade80547fe077bdfce8238a138dd3e3550cd5d03161c3a"}
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.458495 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2400f71f-f7db-4ed8-83aa-8427afd4dcd5","Type":"ContainerStarted","Data":"eb8bafce1f801194c335304b8c1230af7ed56a9b5b62a58262e456a8cd3064b1"}
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.460989 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"75f38ef9-3fc9-428a-8364-96c3938d69e5","Type":"ContainerStarted","Data":"a3c25a2d633e8c72c1e729e764f21aebe7bcecc9a4ea629ca2a7b336f436766f"}
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.461053 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"75f38ef9-3fc9-428a-8364-96c3938d69e5","Type":"ContainerStarted","Data":"ba20aec992260ce95cf4adf7116d8900fd95da0e7241e2c8716cf6312120ef1e"}
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.464166 4717 generic.go:334] "Generic (PLEG): container finished" podID="51572320-b28e-45be-ba55-524f9e0ccc61" containerID="79cde14ec346533f4cb0f07d8ef9a77367c860724692c83afa91af724a0e0e22" exitCode=0
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.464238 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sls6z" event={"ID":"51572320-b28e-45be-ba55-524f9e0ccc61","Type":"ContainerDied","Data":"79cde14ec346533f4cb0f07d8ef9a77367c860724692c83afa91af724a0e0e22"}
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.471621 4717 generic.go:334] "Generic (PLEG): container finished" podID="192e0436-8d26-476c-b6fc-d2d43f26dc1a" containerID="2f7e07700bb3ec5941cda3209ad8b1900c81207289286c996b70aed4e3cbff7e" exitCode=0
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.471901 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" event={"ID":"192e0436-8d26-476c-b6fc-d2d43f26dc1a","Type":"ContainerDied","Data":"2f7e07700bb3ec5941cda3209ad8b1900c81207289286c996b70aed4e3cbff7e"}
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.484455 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.931215206 podStartE2EDuration="17.484434321s" podCreationTimestamp="2026-02-21 22:01:10 +0000 UTC" firstStartedPulling="2026-02-21 22:01:15.907339689 +0000 UTC m=+890.688873311" lastFinishedPulling="2026-02-21 22:01:25.460558804 +0000 UTC m=+900.242092426" observedRunningTime="2026-02-21 22:01:27.476668025 +0000 UTC m=+902.258201717" watchObservedRunningTime="2026-02-21 22:01:27.484434321 +0000 UTC m=+902.265967973"
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.496039 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" event={"ID":"8b22bbd0-3a57-4403-98f2-8781f50f68da","Type":"ContainerStarted","Data":"b5bb9b32ff79c830f543f8c7d0dc5e57c45211e22b4c85f175499e1247eecd39"}
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.543503 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.715715369 podStartE2EDuration="13.543459421s" podCreationTimestamp="2026-02-21 22:01:14 +0000 UTC" firstStartedPulling="2026-02-21 22:01:16.632765871 +0000 UTC m=+891.414299493" lastFinishedPulling="2026-02-21 22:01:25.460509883 +0000 UTC m=+900.242043545" observedRunningTime="2026-02-21 22:01:27.536961585 +0000 UTC m=+902.318495237" watchObservedRunningTime="2026-02-21 22:01:27.543459421 +0000 UTC m=+902.324993053"
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.557473 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-kllk6" podStartSLOduration=3.227545831 podStartE2EDuration="11.557456425s" podCreationTimestamp="2026-02-21 22:01:16 +0000 UTC" firstStartedPulling="2026-02-21 22:01:17.427208282 +0000 UTC m=+892.208741904" lastFinishedPulling="2026-02-21 22:01:25.757118836 +0000 UTC m=+900.538652498" observedRunningTime="2026-02-21 22:01:27.555067527 +0000 UTC m=+902.336601159" watchObservedRunningTime="2026-02-21 22:01:27.557456425 +0000 UTC m=+902.338990047"
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.678824 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-828xd" podStartSLOduration=7.197159137 podStartE2EDuration="16.678800183s" podCreationTimestamp="2026-02-21 22:01:11 +0000 UTC" firstStartedPulling="2026-02-21 22:01:15.979298728 +0000 UTC m=+890.760832350" lastFinishedPulling="2026-02-21 22:01:25.460939734 +0000 UTC m=+900.242473396" observedRunningTime="2026-02-21 22:01:27.664456259 +0000 UTC m=+902.445989881" watchObservedRunningTime="2026-02-21 22:01:27.678800183 +0000 UTC m=+902.460333795"
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.752450 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" podStartSLOduration=11.75242971 podStartE2EDuration="11.75242971s" podCreationTimestamp="2026-02-21 22:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:01:27.743401255 +0000 UTC m=+902.524934877" watchObservedRunningTime="2026-02-21 22:01:27.75242971 +0000 UTC m=+902.533963332"
Feb 21 22:01:27 crc kubenswrapper[4717]: I0221 22:01:27.944444 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:28 crc kubenswrapper[4717]: I0221 22:01:28.509336 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sls6z" event={"ID":"51572320-b28e-45be-ba55-524f9e0ccc61","Type":"ContainerStarted","Data":"2d43097932efa5ffc955a8366cb9e1e6afd1768756e375ff7fbf8bbeab1dca30"}
Feb 21 22:01:28 crc kubenswrapper[4717]: I0221 22:01:28.511821 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-sls6z"
Feb 21 22:01:28 crc kubenswrapper[4717]: I0221 22:01:28.512077 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sls6z" event={"ID":"51572320-b28e-45be-ba55-524f9e0ccc61","Type":"ContainerStarted","Data":"d1215602c95bfe110bf969061571ef3d29812d529bbd746d2cabe02f7d743357"}
Feb 21 22:01:28 crc kubenswrapper[4717]: I0221 22:01:28.512230 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" event={"ID":"192e0436-8d26-476c-b6fc-d2d43f26dc1a","Type":"ContainerStarted","Data":"b594b51b788b480bcf13945f6306f2bb70e293a75705c75946b784eb22716894"}
Feb 21 22:01:28 crc kubenswrapper[4717]: I0221 22:01:28.513061 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-sls6z"
Feb 21 22:01:28 crc kubenswrapper[4717]: I0221 22:01:28.513798 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-ktd86"
Feb 21 22:01:28 crc kubenswrapper[4717]: I0221 22:01:28.551540 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-sls6z" podStartSLOduration=8.46917533 podStartE2EDuration="17.551514141s" podCreationTimestamp="2026-02-21 22:01:11 +0000 UTC" firstStartedPulling="2026-02-21 22:01:16.03255749 +0000 UTC m=+890.814091112" lastFinishedPulling="2026-02-21 22:01:25.114896301 +0000 UTC m=+899.896429923" observedRunningTime="2026-02-21 22:01:28.546426869 +0000 UTC m=+903.327960491" watchObservedRunningTime="2026-02-21 22:01:28.551514141 +0000 UTC m=+903.333047773"
Feb 21 22:01:28 crc kubenswrapper[4717]: I0221 22:01:28.568220 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" podStartSLOduration=12.56819449 podStartE2EDuration="12.56819449s" podCreationTimestamp="2026-02-21 22:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:01:28.560623689 +0000 UTC m=+903.342157321" watchObservedRunningTime="2026-02-21 22:01:28.56819449 +0000 UTC m=+903.349728122"
Feb 21 22:01:29 crc kubenswrapper[4717]: I0221 22:01:29.525380 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-s78gk"
Feb 21 22:01:29 crc kubenswrapper[4717]: I0221 22:01:29.931743 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:29 crc kubenswrapper[4717]: I0221 22:01:29.997278 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:30 crc kubenswrapper[4717]: I0221 22:01:30.531298 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:30 crc kubenswrapper[4717]: I0221 22:01:30.944427 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.021651 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.078424 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.596055 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.605853 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.954688 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 21 22:01:31 crc kubenswrapper[4717]: E0221 22:01:31.954992 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f38ebc64-08ee-432f-a8af-85e84a8608ee" containerName="init"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.955004 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f38ebc64-08ee-432f-a8af-85e84a8608ee" containerName="init"
Feb 21 22:01:31 crc kubenswrapper[4717]: E0221 22:01:31.955014 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3da2c9-9230-4114-b6fd-95f79b5d7c3c" containerName="dnsmasq-dns"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.955020 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3da2c9-9230-4114-b6fd-95f79b5d7c3c" containerName="dnsmasq-dns"
Feb 21 22:01:31 crc kubenswrapper[4717]: E0221 22:01:31.955034 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f38ebc64-08ee-432f-a8af-85e84a8608ee" containerName="dnsmasq-dns"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.955040 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f38ebc64-08ee-432f-a8af-85e84a8608ee" containerName="dnsmasq-dns"
Feb 21 22:01:31 crc kubenswrapper[4717]: E0221 22:01:31.955054 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3da2c9-9230-4114-b6fd-95f79b5d7c3c" containerName="init"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.955060 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3da2c9-9230-4114-b6fd-95f79b5d7c3c" containerName="init"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.955186 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c3da2c9-9230-4114-b6fd-95f79b5d7c3c" containerName="dnsmasq-dns"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.955204 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f38ebc64-08ee-432f-a8af-85e84a8608ee" containerName="dnsmasq-dns"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.956382 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.969441 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-d8kvc"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.969639 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.969738 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Feb 21 22:01:31 crc kubenswrapper[4717]: I0221 22:01:31.970416 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.001481 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.043580 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/34cc3509-6f63-43c2-86a3-284360464284-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.043689 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/34cc3509-6f63-43c2-86a3-284360464284-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.043952 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34cc3509-6f63-43c2-86a3-284360464284-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.043996 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/34cc3509-6f63-43c2-86a3-284360464284-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.044044 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kh5n\" (UniqueName: \"kubernetes.io/projected/34cc3509-6f63-43c2-86a3-284360464284-kube-api-access-5kh5n\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.044129 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34cc3509-6f63-43c2-86a3-284360464284-scripts\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.044259 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34cc3509-6f63-43c2-86a3-284360464284-config\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.145732 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/34cc3509-6f63-43c2-86a3-284360464284-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.145795 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/34cc3509-6f63-43c2-86a3-284360464284-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.145818 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34cc3509-6f63-43c2-86a3-284360464284-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.145834 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/34cc3509-6f63-43c2-86a3-284360464284-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.145873 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kh5n\" (UniqueName: \"kubernetes.io/projected/34cc3509-6f63-43c2-86a3-284360464284-kube-api-access-5kh5n\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.145895 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34cc3509-6f63-43c2-86a3-284360464284-scripts\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.145960 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34cc3509-6f63-43c2-86a3-284360464284-config\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.146320 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/34cc3509-6f63-43c2-86a3-284360464284-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.146936 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34cc3509-6f63-43c2-86a3-284360464284-config\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.146973 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34cc3509-6f63-43c2-86a3-284360464284-scripts\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.151786 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/34cc3509-6f63-43c2-86a3-284360464284-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.151795 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/34cc3509-6f63-43c2-86a3-284360464284-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.152574 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34cc3509-6f63-43c2-86a3-284360464284-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.162234 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kh5n\" (UniqueName: \"kubernetes.io/projected/34cc3509-6f63-43c2-86a3-284360464284-kube-api-access-5kh5n\") pod \"ovn-northd-0\" (UID: \"34cc3509-6f63-43c2-86a3-284360464284\") " pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.189672 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-s78gk"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.243450 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ktd86"]
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.243671 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" podUID="8b22bbd0-3a57-4403-98f2-8781f50f68da" containerName="dnsmasq-dns" containerID="cri-o://b5bb9b32ff79c830f543f8c7d0dc5e57c45211e22b4c85f175499e1247eecd39" gracePeriod=10
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.247212 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-ktd86"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.306652 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.565243 4717 generic.go:334] "Generic (PLEG): container finished" podID="8b22bbd0-3a57-4403-98f2-8781f50f68da" containerID="b5bb9b32ff79c830f543f8c7d0dc5e57c45211e22b4c85f175499e1247eecd39" exitCode=0
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.565416 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" event={"ID":"8b22bbd0-3a57-4403-98f2-8781f50f68da","Type":"ContainerDied","Data":"b5bb9b32ff79c830f543f8c7d0dc5e57c45211e22b4c85f175499e1247eecd39"}
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.567464 4717 generic.go:334] "Generic (PLEG): container finished" podID="4a73e4bf-8575-43f8-bfff-35b8ca593732" containerID="df6b5625c8f2d63c0cc9451de169f2392535e259f79ea3f3e9a5481ac21af337" exitCode=0
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.568217 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4a73e4bf-8575-43f8-bfff-35b8ca593732","Type":"ContainerDied","Data":"df6b5625c8f2d63c0cc9451de169f2392535e259f79ea3f3e9a5481ac21af337"}
Feb 21 22:01:32 crc kubenswrapper[4717]: I0221 22:01:32.728803 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 21 22:01:32 crc kubenswrapper[4717]: W0221 22:01:32.732508 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34cc3509_6f63_43c2_86a3_284360464284.slice/crio-833a9a924c0ab738680bfaed52db3bd49b41ee5bb2f64e0ee21b80c6c6579198 WatchSource:0}: Error finding container 833a9a924c0ab738680bfaed52db3bd49b41ee5bb2f64e0ee21b80c6c6579198: Status 404 returned error can't find the container with id 833a9a924c0ab738680bfaed52db3bd49b41ee5bb2f64e0ee21b80c6c6579198
Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.576470 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4a73e4bf-8575-43f8-bfff-35b8ca593732","Type":"ContainerStarted","Data":"7710a842d6828cad5c694c0aca665a6a2a7474087d5978c88a5c9b4fd3987b0c"}
Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.578471 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"34cc3509-6f63-43c2-86a3-284360464284","Type":"ContainerStarted","Data":"833a9a924c0ab738680bfaed52db3bd49b41ee5bb2f64e0ee21b80c6c6579198"}
Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.580248 4717 generic.go:334] "Generic (PLEG): container finished" podID="e49285f0-879f-40db-8eb9-2e8e18a87bb7" containerID="fa3b40a4b90039a82ff3830277b2df2aa64b9457852fdc4a4d6ad7fd8b2e9cd3" exitCode=0
Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.580317 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e49285f0-879f-40db-8eb9-2e8e18a87bb7","Type":"ContainerDied","Data":"fa3b40a4b90039a82ff3830277b2df2aa64b9457852fdc4a4d6ad7fd8b2e9cd3"}
Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.600136 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.676474344 podStartE2EDuration="29.600114334s" podCreationTimestamp="2026-02-21 22:01:04 +0000 UTC" firstStartedPulling="2026-02-21 22:01:15.791109564 +0000 UTC m=+890.572643186" lastFinishedPulling="2026-02-21 22:01:25.714749514 +0000 UTC m=+900.496283176" observedRunningTime="2026-02-21 22:01:33.595657228 +0000 UTC m=+908.377190850" watchObservedRunningTime="2026-02-21 22:01:33.600114334 +0000 UTC m=+908.381647956"
Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.865149 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ktd86"
Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.877688 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hh6q\" (UniqueName: \"kubernetes.io/projected/8b22bbd0-3a57-4403-98f2-8781f50f68da-kube-api-access-8hh6q\") pod \"8b22bbd0-3a57-4403-98f2-8781f50f68da\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") "
Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.877773 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-dns-svc\") pod \"8b22bbd0-3a57-4403-98f2-8781f50f68da\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") "
Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.877931 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-config\") pod \"8b22bbd0-3a57-4403-98f2-8781f50f68da\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") "
Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.878013 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-ovsdbserver-nb\") pod \"8b22bbd0-3a57-4403-98f2-8781f50f68da\" (UID: \"8b22bbd0-3a57-4403-98f2-8781f50f68da\") "
Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.898079 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b22bbd0-3a57-4403-98f2-8781f50f68da-kube-api-access-8hh6q" (OuterVolumeSpecName: "kube-api-access-8hh6q") pod "8b22bbd0-3a57-4403-98f2-8781f50f68da" (UID: "8b22bbd0-3a57-4403-98f2-8781f50f68da"). InnerVolumeSpecName "kube-api-access-8hh6q".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.938404 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8b22bbd0-3a57-4403-98f2-8781f50f68da" (UID: "8b22bbd0-3a57-4403-98f2-8781f50f68da"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.953690 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-config" (OuterVolumeSpecName: "config") pod "8b22bbd0-3a57-4403-98f2-8781f50f68da" (UID: "8b22bbd0-3a57-4403-98f2-8781f50f68da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.974903 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8b22bbd0-3a57-4403-98f2-8781f50f68da" (UID: "8b22bbd0-3a57-4403-98f2-8781f50f68da"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.979645 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.979682 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.979697 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hh6q\" (UniqueName: \"kubernetes.io/projected/8b22bbd0-3a57-4403-98f2-8781f50f68da-kube-api-access-8hh6q\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:33 crc kubenswrapper[4717]: I0221 22:01:33.979706 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b22bbd0-3a57-4403-98f2-8781f50f68da-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:34 crc kubenswrapper[4717]: I0221 22:01:34.592151 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" event={"ID":"8b22bbd0-3a57-4403-98f2-8781f50f68da","Type":"ContainerDied","Data":"8bfbfccbb9c2accd711188f07f858df2d0cc5319f5152cdedc674987fbfc52e2"} Feb 21 22:01:34 crc kubenswrapper[4717]: I0221 22:01:34.592217 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ktd86" Feb 21 22:01:34 crc kubenswrapper[4717]: I0221 22:01:34.592822 4717 scope.go:117] "RemoveContainer" containerID="b5bb9b32ff79c830f543f8c7d0dc5e57c45211e22b4c85f175499e1247eecd39" Feb 21 22:01:34 crc kubenswrapper[4717]: I0221 22:01:34.594020 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2e08bfd4-7ef2-4895-89e2-c9265d0adc13","Type":"ContainerStarted","Data":"69448ec46413811c070cdb37b272c6eeaec778e1fd6a5eca434201575ec96f11"} Feb 21 22:01:34 crc kubenswrapper[4717]: I0221 22:01:34.594172 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 21 22:01:34 crc kubenswrapper[4717]: I0221 22:01:34.599135 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e49285f0-879f-40db-8eb9-2e8e18a87bb7","Type":"ContainerStarted","Data":"ff45b84b5f0d99f3566781465a4a30fc648ce2472a5eed5e14acd396e37be290"} Feb 21 22:01:34 crc kubenswrapper[4717]: I0221 22:01:34.602801 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"34cc3509-6f63-43c2-86a3-284360464284","Type":"ContainerStarted","Data":"f09ddaf2e9431e13a5a99fcf9228d99a0a65186c39eff2853135ed550fdfb9ec"} Feb 21 22:01:34 crc kubenswrapper[4717]: I0221 22:01:34.631398 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ktd86"] Feb 21 22:01:34 crc kubenswrapper[4717]: I0221 22:01:34.632891 4717 scope.go:117] "RemoveContainer" containerID="821690f2ead9f6949b5443f1437473d6780c33184697b10d3b441059c3cbf2f3" Feb 21 22:01:34 crc kubenswrapper[4717]: I0221 22:01:34.647524 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ktd86"] Feb 21 22:01:34 crc kubenswrapper[4717]: I0221 22:01:34.684828 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstack-galera-0" podStartSLOduration=22.92511437 podStartE2EDuration="32.684803465s" podCreationTimestamp="2026-02-21 22:01:02 +0000 UTC" firstStartedPulling="2026-02-21 22:01:15.679817927 +0000 UTC m=+890.461351549" lastFinishedPulling="2026-02-21 22:01:25.439507012 +0000 UTC m=+900.221040644" observedRunningTime="2026-02-21 22:01:34.653736742 +0000 UTC m=+909.435270394" watchObservedRunningTime="2026-02-21 22:01:34.684803465 +0000 UTC m=+909.466337127" Feb 21 22:01:34 crc kubenswrapper[4717]: I0221 22:01:34.686243 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.787073907 podStartE2EDuration="27.686230019s" podCreationTimestamp="2026-02-21 22:01:07 +0000 UTC" firstStartedPulling="2026-02-21 22:01:15.817476854 +0000 UTC m=+890.599010476" lastFinishedPulling="2026-02-21 22:01:33.716632976 +0000 UTC m=+908.498166588" observedRunningTime="2026-02-21 22:01:34.68208147 +0000 UTC m=+909.463615102" watchObservedRunningTime="2026-02-21 22:01:34.686230019 +0000 UTC m=+909.467763681" Feb 21 22:01:35 crc kubenswrapper[4717]: I0221 22:01:35.623974 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"34cc3509-6f63-43c2-86a3-284360464284","Type":"ContainerStarted","Data":"453e92c9e25fd0d4f8f4943cbe6169835a73a865b14d7cb3ff7f73e104ac1ad6"} Feb 21 22:01:35 crc kubenswrapper[4717]: I0221 22:01:35.624387 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 21 22:01:35 crc kubenswrapper[4717]: I0221 22:01:35.659559 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.107644801 podStartE2EDuration="4.659529049s" podCreationTimestamp="2026-02-21 22:01:31 +0000 UTC" firstStartedPulling="2026-02-21 22:01:32.735663011 +0000 UTC m=+907.517196643" lastFinishedPulling="2026-02-21 22:01:34.287547269 +0000 UTC m=+909.069080891" 
observedRunningTime="2026-02-21 22:01:35.652810317 +0000 UTC m=+910.434343979" watchObservedRunningTime="2026-02-21 22:01:35.659529049 +0000 UTC m=+910.441062701" Feb 21 22:01:35 crc kubenswrapper[4717]: I0221 22:01:35.718128 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:35 crc kubenswrapper[4717]: I0221 22:01:35.718211 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:35 crc kubenswrapper[4717]: I0221 22:01:35.990679 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b22bbd0-3a57-4403-98f2-8781f50f68da" path="/var/lib/kubelet/pods/8b22bbd0-3a57-4403-98f2-8781f50f68da/volumes" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.298381 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-v7s8s"] Feb 21 22:01:38 crc kubenswrapper[4717]: E0221 22:01:38.298913 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b22bbd0-3a57-4403-98f2-8781f50f68da" containerName="dnsmasq-dns" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.298925 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b22bbd0-3a57-4403-98f2-8781f50f68da" containerName="dnsmasq-dns" Feb 21 22:01:38 crc kubenswrapper[4717]: E0221 22:01:38.298956 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b22bbd0-3a57-4403-98f2-8781f50f68da" containerName="init" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.298962 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b22bbd0-3a57-4403-98f2-8781f50f68da" containerName="init" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.299104 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b22bbd0-3a57-4403-98f2-8781f50f68da" containerName="dnsmasq-dns" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.299840 4717 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.320286 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-v7s8s"] Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.342708 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.435381 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.445587 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-config\") pod \"dnsmasq-dns-698758b865-v7s8s\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.445654 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-v7s8s\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.445814 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-v7s8s\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.446804 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-dns-svc\") pod \"dnsmasq-dns-698758b865-v7s8s\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.446855 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcvk2\" (UniqueName: \"kubernetes.io/projected/ea234932-2730-4dfa-9e21-91bc7575a885-kube-api-access-tcvk2\") pod \"dnsmasq-dns-698758b865-v7s8s\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.548595 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-config\") pod \"dnsmasq-dns-698758b865-v7s8s\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.548639 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-v7s8s\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.548704 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-v7s8s\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.548777 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-dns-svc\") pod \"dnsmasq-dns-698758b865-v7s8s\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.548799 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcvk2\" (UniqueName: \"kubernetes.io/projected/ea234932-2730-4dfa-9e21-91bc7575a885-kube-api-access-tcvk2\") pod \"dnsmasq-dns-698758b865-v7s8s\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.549754 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-v7s8s\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.549782 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-v7s8s\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.549845 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-dns-svc\") pod \"dnsmasq-dns-698758b865-v7s8s\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.550377 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-config\") pod 
\"dnsmasq-dns-698758b865-v7s8s\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.570613 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcvk2\" (UniqueName: \"kubernetes.io/projected/ea234932-2730-4dfa-9e21-91bc7575a885-kube-api-access-tcvk2\") pod \"dnsmasq-dns-698758b865-v7s8s\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:38 crc kubenswrapper[4717]: I0221 22:01:38.615379 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.062537 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.062751 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.084400 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-v7s8s"] Feb 21 22:01:39 crc kubenswrapper[4717]: W0221 22:01:39.091612 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea234932_2730_4dfa_9e21_91bc7575a885.slice/crio-ec030064b7cac483acc7a051a8df35404d40eb1335338dbff253d68be60b9517 WatchSource:0}: Error finding container 
ec030064b7cac483acc7a051a8df35404d40eb1335338dbff253d68be60b9517: Status 404 returned error can't find the container with id ec030064b7cac483acc7a051a8df35404d40eb1335338dbff253d68be60b9517 Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.521293 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.530476 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.537018 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.539330 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.538746 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.540067 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-gcbfh" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.551616 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.659386 4717 generic.go:334] "Generic (PLEG): container finished" podID="ea234932-2730-4dfa-9e21-91bc7575a885" containerID="a8ad14f330d1b3520f6f9087e70cfa2f8da47f1603e8986db1a7b3b9814e50ae" exitCode=0 Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.659447 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-v7s8s" event={"ID":"ea234932-2730-4dfa-9e21-91bc7575a885","Type":"ContainerDied","Data":"a8ad14f330d1b3520f6f9087e70cfa2f8da47f1603e8986db1a7b3b9814e50ae"} Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.659491 4717 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-698758b865-v7s8s" event={"ID":"ea234932-2730-4dfa-9e21-91bc7575a885","Type":"ContainerStarted","Data":"ec030064b7cac483acc7a051a8df35404d40eb1335338dbff253d68be60b9517"} Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.666968 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.667366 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.667617 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1fc309c6-44f4-4daf-90fa-6bf6845f195d-lock\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.667949 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g5h9\" (UniqueName: \"kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-kube-api-access-9g5h9\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.668187 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1fc309c6-44f4-4daf-90fa-6bf6845f195d-cache\") pod \"swift-storage-0\" (UID: 
\"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.668429 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc309c6-44f4-4daf-90fa-6bf6845f195d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.769970 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1fc309c6-44f4-4daf-90fa-6bf6845f195d-cache\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.770008 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc309c6-44f4-4daf-90fa-6bf6845f195d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.770038 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.770096 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.770149 4717 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1fc309c6-44f4-4daf-90fa-6bf6845f195d-lock\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.770210 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g5h9\" (UniqueName: \"kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-kube-api-access-9g5h9\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.770387 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.770553 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1fc309c6-44f4-4daf-90fa-6bf6845f195d-lock\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: E0221 22:01:39.770598 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.770605 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1fc309c6-44f4-4daf-90fa-6bf6845f195d-cache\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: E0221 22:01:39.770612 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: 
configmap "swift-ring-files" not found Feb 21 22:01:39 crc kubenswrapper[4717]: E0221 22:01:39.770892 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift podName:1fc309c6-44f4-4daf-90fa-6bf6845f195d nodeName:}" failed. No retries permitted until 2026-02-21 22:01:40.270848941 +0000 UTC m=+915.052382573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift") pod "swift-storage-0" (UID: "1fc309c6-44f4-4daf-90fa-6bf6845f195d") : configmap "swift-ring-files" not found Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.775162 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc309c6-44f4-4daf-90fa-6bf6845f195d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.792975 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g5h9\" (UniqueName: \"kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-kube-api-access-9g5h9\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:39 crc kubenswrapper[4717]: I0221 22:01:39.797652 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.040855 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-ssj5d"] Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.042461 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.047768 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.048579 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.054117 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.075287 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-ssj5d"] Feb 21 22:01:40 crc kubenswrapper[4717]: E0221 22:01:40.076020 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-gf2c2 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-gf2c2 ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-ssj5d" podUID="c2abea91-4593-4367-98cb-407420c51b98" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.082904 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-5zc8b"] Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.084472 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.094329 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-5zc8b"] Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.128378 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-ssj5d"] Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.181732 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c2abea91-4593-4367-98cb-407420c51b98-etc-swift\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.181777 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-swiftconf\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.181795 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrqt2\" (UniqueName: \"kubernetes.io/projected/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-kube-api-access-xrqt2\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.181816 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-dispersionconf\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 
21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.181885 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf2c2\" (UniqueName: \"kubernetes.io/projected/c2abea91-4593-4367-98cb-407420c51b98-kube-api-access-gf2c2\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.181926 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-ring-data-devices\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.181941 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2abea91-4593-4367-98cb-407420c51b98-scripts\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.182073 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-dispersionconf\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.182151 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-combined-ca-bundle\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " 
pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.182301 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-swiftconf\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.182388 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-combined-ca-bundle\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.182492 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c2abea91-4593-4367-98cb-407420c51b98-ring-data-devices\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.182543 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-etc-swift\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.182613 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-scripts\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " 
pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.284772 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c2abea91-4593-4367-98cb-407420c51b98-ring-data-devices\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.284932 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-etc-swift\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.285003 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.285064 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-scripts\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.285157 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c2abea91-4593-4367-98cb-407420c51b98-etc-swift\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: E0221 22:01:40.285322 4717 projected.go:288] 
Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 21 22:01:40 crc kubenswrapper[4717]: E0221 22:01:40.285592 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.285787 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-etc-swift\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.286018 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c2abea91-4593-4367-98cb-407420c51b98-etc-swift\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.286099 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-scripts\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.286105 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-swiftconf\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.286172 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrqt2\" (UniqueName: 
\"kubernetes.io/projected/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-kube-api-access-xrqt2\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.286200 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-dispersionconf\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.286232 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf2c2\" (UniqueName: \"kubernetes.io/projected/c2abea91-4593-4367-98cb-407420c51b98-kube-api-access-gf2c2\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.286267 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-ring-data-devices\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.286298 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2abea91-4593-4367-98cb-407420c51b98-scripts\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.286376 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-dispersionconf\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.286412 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c2abea91-4593-4367-98cb-407420c51b98-ring-data-devices\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.286448 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-combined-ca-bundle\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.286549 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-swiftconf\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.286620 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-combined-ca-bundle\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: E0221 22:01:40.287079 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift podName:1fc309c6-44f4-4daf-90fa-6bf6845f195d nodeName:}" 
failed. No retries permitted until 2026-02-21 22:01:41.287059582 +0000 UTC m=+916.068593304 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift") pod "swift-storage-0" (UID: "1fc309c6-44f4-4daf-90fa-6bf6845f195d") : configmap "swift-ring-files" not found Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.287137 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-ring-data-devices\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.288120 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2abea91-4593-4367-98cb-407420c51b98-scripts\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.291348 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-dispersionconf\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.291363 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-dispersionconf\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.292110 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-combined-ca-bundle\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.294644 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-swiftconf\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.296661 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-swiftconf\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.296698 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-combined-ca-bundle\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.312649 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf2c2\" (UniqueName: \"kubernetes.io/projected/c2abea91-4593-4367-98cb-407420c51b98-kube-api-access-gf2c2\") pod \"swift-ring-rebalance-ssj5d\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.317743 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrqt2\" (UniqueName: 
\"kubernetes.io/projected/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-kube-api-access-xrqt2\") pod \"swift-ring-rebalance-5zc8b\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.404096 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.669431 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.669712 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-v7s8s" event={"ID":"ea234932-2730-4dfa-9e21-91bc7575a885","Type":"ContainerStarted","Data":"a6a3a5694ed3bec429f0997e51c6a13341eb16d4169e60fc261ba7937e0b5acf"} Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.695777 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.701986 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-v7s8s" podStartSLOduration=2.701971565 podStartE2EDuration="2.701971565s" podCreationTimestamp="2026-02-21 22:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:01:40.697122359 +0000 UTC m=+915.478655981" watchObservedRunningTime="2026-02-21 22:01:40.701971565 +0000 UTC m=+915.483505177" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.798017 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c2abea91-4593-4367-98cb-407420c51b98-ring-data-devices\") pod \"c2abea91-4593-4367-98cb-407420c51b98\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.798160 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf2c2\" (UniqueName: \"kubernetes.io/projected/c2abea91-4593-4367-98cb-407420c51b98-kube-api-access-gf2c2\") pod \"c2abea91-4593-4367-98cb-407420c51b98\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.798208 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c2abea91-4593-4367-98cb-407420c51b98-etc-swift\") pod \"c2abea91-4593-4367-98cb-407420c51b98\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.798248 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-swiftconf\") pod \"c2abea91-4593-4367-98cb-407420c51b98\" 
(UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.798276 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-combined-ca-bundle\") pod \"c2abea91-4593-4367-98cb-407420c51b98\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.798310 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-dispersionconf\") pod \"c2abea91-4593-4367-98cb-407420c51b98\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.798374 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2abea91-4593-4367-98cb-407420c51b98-scripts\") pod \"c2abea91-4593-4367-98cb-407420c51b98\" (UID: \"c2abea91-4593-4367-98cb-407420c51b98\") " Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.798502 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2abea91-4593-4367-98cb-407420c51b98-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c2abea91-4593-4367-98cb-407420c51b98" (UID: "c2abea91-4593-4367-98cb-407420c51b98"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.798675 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2abea91-4593-4367-98cb-407420c51b98-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c2abea91-4593-4367-98cb-407420c51b98" (UID: "c2abea91-4593-4367-98cb-407420c51b98"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.798959 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2abea91-4593-4367-98cb-407420c51b98-scripts" (OuterVolumeSpecName: "scripts") pod "c2abea91-4593-4367-98cb-407420c51b98" (UID: "c2abea91-4593-4367-98cb-407420c51b98"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.799525 4717 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c2abea91-4593-4367-98cb-407420c51b98-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.799543 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2abea91-4593-4367-98cb-407420c51b98-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.799553 4717 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c2abea91-4593-4367-98cb-407420c51b98-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.802981 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2abea91-4593-4367-98cb-407420c51b98-kube-api-access-gf2c2" (OuterVolumeSpecName: "kube-api-access-gf2c2") pod "c2abea91-4593-4367-98cb-407420c51b98" (UID: "c2abea91-4593-4367-98cb-407420c51b98"). InnerVolumeSpecName "kube-api-access-gf2c2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.803087 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2abea91-4593-4367-98cb-407420c51b98" (UID: "c2abea91-4593-4367-98cb-407420c51b98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.803188 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c2abea91-4593-4367-98cb-407420c51b98" (UID: "c2abea91-4593-4367-98cb-407420c51b98"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.804605 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c2abea91-4593-4367-98cb-407420c51b98" (UID: "c2abea91-4593-4367-98cb-407420c51b98"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.902072 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-5zc8b"] Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.903039 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf2c2\" (UniqueName: \"kubernetes.io/projected/c2abea91-4593-4367-98cb-407420c51b98-kube-api-access-gf2c2\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.903065 4717 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.903081 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:40 crc kubenswrapper[4717]: I0221 22:01:40.903093 4717 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c2abea91-4593-4367-98cb-407420c51b98-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:40 crc kubenswrapper[4717]: W0221 22:01:40.909596 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d1b5d67_1e8c_4c1f_a6a3_9634827165f8.slice/crio-4e0d27e956eb54c12c1290759b95385954f082780fe54bdea0c766502bdb1c83 WatchSource:0}: Error finding container 4e0d27e956eb54c12c1290759b95385954f082780fe54bdea0c766502bdb1c83: Status 404 returned error can't find the container with id 4e0d27e956eb54c12c1290759b95385954f082780fe54bdea0c766502bdb1c83 Feb 21 22:01:41 crc kubenswrapper[4717]: I0221 22:01:41.309282 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:41 crc kubenswrapper[4717]: E0221 22:01:41.309516 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 21 22:01:41 crc kubenswrapper[4717]: E0221 22:01:41.309648 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 21 22:01:41 crc kubenswrapper[4717]: E0221 22:01:41.309718 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift podName:1fc309c6-44f4-4daf-90fa-6bf6845f195d nodeName:}" failed. No retries permitted until 2026-02-21 22:01:43.309698068 +0000 UTC m=+918.091231690 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift") pod "swift-storage-0" (UID: "1fc309c6-44f4-4daf-90fa-6bf6845f195d") : configmap "swift-ring-files" not found Feb 21 22:01:41 crc kubenswrapper[4717]: I0221 22:01:41.688066 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-ssj5d" Feb 21 22:01:41 crc kubenswrapper[4717]: I0221 22:01:41.688343 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5zc8b" event={"ID":"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8","Type":"ContainerStarted","Data":"4e0d27e956eb54c12c1290759b95385954f082780fe54bdea0c766502bdb1c83"} Feb 21 22:01:41 crc kubenswrapper[4717]: I0221 22:01:41.688837 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:01:41 crc kubenswrapper[4717]: I0221 22:01:41.738136 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-ssj5d"] Feb 21 22:01:41 crc kubenswrapper[4717]: I0221 22:01:41.745039 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-ssj5d"] Feb 21 22:01:42 crc kubenswrapper[4717]: I0221 22:01:42.008208 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2abea91-4593-4367-98cb-407420c51b98" path="/var/lib/kubelet/pods/c2abea91-4593-4367-98cb-407420c51b98/volumes" Feb 21 22:01:42 crc kubenswrapper[4717]: I0221 22:01:42.335361 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ncbrn"] Feb 21 22:01:42 crc kubenswrapper[4717]: I0221 22:01:42.341384 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ncbrn"
Feb 21 22:01:42 crc kubenswrapper[4717]: I0221 22:01:42.344486 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ncbrn"]
Feb 21 22:01:42 crc kubenswrapper[4717]: I0221 22:01:42.427926 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bed5ac2-5539-4722-9432-1ee6618783bc-catalog-content\") pod \"redhat-operators-ncbrn\" (UID: \"6bed5ac2-5539-4722-9432-1ee6618783bc\") " pod="openshift-marketplace/redhat-operators-ncbrn"
Feb 21 22:01:42 crc kubenswrapper[4717]: I0221 22:01:42.428286 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bed5ac2-5539-4722-9432-1ee6618783bc-utilities\") pod \"redhat-operators-ncbrn\" (UID: \"6bed5ac2-5539-4722-9432-1ee6618783bc\") " pod="openshift-marketplace/redhat-operators-ncbrn"
Feb 21 22:01:42 crc kubenswrapper[4717]: I0221 22:01:42.428411 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2rn5\" (UniqueName: \"kubernetes.io/projected/6bed5ac2-5539-4722-9432-1ee6618783bc-kube-api-access-m2rn5\") pod \"redhat-operators-ncbrn\" (UID: \"6bed5ac2-5539-4722-9432-1ee6618783bc\") " pod="openshift-marketplace/redhat-operators-ncbrn"
Feb 21 22:01:42 crc kubenswrapper[4717]: I0221 22:01:42.529627 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bed5ac2-5539-4722-9432-1ee6618783bc-utilities\") pod \"redhat-operators-ncbrn\" (UID: \"6bed5ac2-5539-4722-9432-1ee6618783bc\") " pod="openshift-marketplace/redhat-operators-ncbrn"
Feb 21 22:01:42 crc kubenswrapper[4717]: I0221 22:01:42.529753 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2rn5\" (UniqueName: \"kubernetes.io/projected/6bed5ac2-5539-4722-9432-1ee6618783bc-kube-api-access-m2rn5\") pod \"redhat-operators-ncbrn\" (UID: \"6bed5ac2-5539-4722-9432-1ee6618783bc\") " pod="openshift-marketplace/redhat-operators-ncbrn"
Feb 21 22:01:42 crc kubenswrapper[4717]: I0221 22:01:42.529815 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bed5ac2-5539-4722-9432-1ee6618783bc-catalog-content\") pod \"redhat-operators-ncbrn\" (UID: \"6bed5ac2-5539-4722-9432-1ee6618783bc\") " pod="openshift-marketplace/redhat-operators-ncbrn"
Feb 21 22:01:42 crc kubenswrapper[4717]: I0221 22:01:42.530341 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bed5ac2-5539-4722-9432-1ee6618783bc-catalog-content\") pod \"redhat-operators-ncbrn\" (UID: \"6bed5ac2-5539-4722-9432-1ee6618783bc\") " pod="openshift-marketplace/redhat-operators-ncbrn"
Feb 21 22:01:42 crc kubenswrapper[4717]: I0221 22:01:42.530526 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bed5ac2-5539-4722-9432-1ee6618783bc-utilities\") pod \"redhat-operators-ncbrn\" (UID: \"6bed5ac2-5539-4722-9432-1ee6618783bc\") " pod="openshift-marketplace/redhat-operators-ncbrn"
Feb 21 22:01:42 crc kubenswrapper[4717]: I0221 22:01:42.551936 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2rn5\" (UniqueName: \"kubernetes.io/projected/6bed5ac2-5539-4722-9432-1ee6618783bc-kube-api-access-m2rn5\") pod \"redhat-operators-ncbrn\" (UID: \"6bed5ac2-5539-4722-9432-1ee6618783bc\") " pod="openshift-marketplace/redhat-operators-ncbrn"
Feb 21 22:01:42 crc kubenswrapper[4717]: I0221 22:01:42.660456 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ncbrn"
Feb 21 22:01:43 crc kubenswrapper[4717]: I0221 22:01:43.341620 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0"
Feb 21 22:01:43 crc kubenswrapper[4717]: E0221 22:01:43.342092 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 21 22:01:43 crc kubenswrapper[4717]: E0221 22:01:43.342106 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 21 22:01:43 crc kubenswrapper[4717]: E0221 22:01:43.342150 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift podName:1fc309c6-44f4-4daf-90fa-6bf6845f195d nodeName:}" failed. No retries permitted until 2026-02-21 22:01:47.342135244 +0000 UTC m=+922.123668866 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift") pod "swift-storage-0" (UID: "1fc309c6-44f4-4daf-90fa-6bf6845f195d") : configmap "swift-ring-files" not found
Feb 21 22:01:44 crc kubenswrapper[4717]: I0221 22:01:44.267030 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 21 22:01:44 crc kubenswrapper[4717]: I0221 22:01:44.267121 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 21 22:01:44 crc kubenswrapper[4717]: I0221 22:01:44.359345 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Feb 21 22:01:44 crc kubenswrapper[4717]: I0221 22:01:44.412373 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-75qq2"]
Feb 21 22:01:44 crc kubenswrapper[4717]: I0221 22:01:44.415603 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-75qq2"
Feb 21 22:01:44 crc kubenswrapper[4717]: I0221 22:01:44.417796 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 21 22:01:44 crc kubenswrapper[4717]: I0221 22:01:44.432368 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-75qq2"]
Feb 21 22:01:44 crc kubenswrapper[4717]: I0221 22:01:44.562020 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfvtj\" (UniqueName: \"kubernetes.io/projected/be146b59-8dd7-4274-aa00-9d9872fddf6e-kube-api-access-wfvtj\") pod \"root-account-create-update-75qq2\" (UID: \"be146b59-8dd7-4274-aa00-9d9872fddf6e\") " pod="openstack/root-account-create-update-75qq2"
Feb 21 22:01:44 crc kubenswrapper[4717]: I0221 22:01:44.562478 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be146b59-8dd7-4274-aa00-9d9872fddf6e-operator-scripts\") pod \"root-account-create-update-75qq2\" (UID: \"be146b59-8dd7-4274-aa00-9d9872fddf6e\") " pod="openstack/root-account-create-update-75qq2"
Feb 21 22:01:44 crc kubenswrapper[4717]: I0221 22:01:44.664047 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfvtj\" (UniqueName: \"kubernetes.io/projected/be146b59-8dd7-4274-aa00-9d9872fddf6e-kube-api-access-wfvtj\") pod \"root-account-create-update-75qq2\" (UID: \"be146b59-8dd7-4274-aa00-9d9872fddf6e\") " pod="openstack/root-account-create-update-75qq2"
Feb 21 22:01:44 crc kubenswrapper[4717]: I0221 22:01:44.664105 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be146b59-8dd7-4274-aa00-9d9872fddf6e-operator-scripts\") pod \"root-account-create-update-75qq2\" (UID: \"be146b59-8dd7-4274-aa00-9d9872fddf6e\") " pod="openstack/root-account-create-update-75qq2"
Feb 21 22:01:44 crc kubenswrapper[4717]: I0221 22:01:44.664804 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be146b59-8dd7-4274-aa00-9d9872fddf6e-operator-scripts\") pod \"root-account-create-update-75qq2\" (UID: \"be146b59-8dd7-4274-aa00-9d9872fddf6e\") " pod="openstack/root-account-create-update-75qq2"
Feb 21 22:01:44 crc kubenswrapper[4717]: I0221 22:01:44.683566 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfvtj\" (UniqueName: \"kubernetes.io/projected/be146b59-8dd7-4274-aa00-9d9872fddf6e-kube-api-access-wfvtj\") pod \"root-account-create-update-75qq2\" (UID: \"be146b59-8dd7-4274-aa00-9d9872fddf6e\") " pod="openstack/root-account-create-update-75qq2"
Feb 21 22:01:44 crc kubenswrapper[4717]: I0221 22:01:44.741997 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-75qq2"
Feb 21 22:01:44 crc kubenswrapper[4717]: I0221 22:01:44.793953 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Feb 21 22:01:45 crc kubenswrapper[4717]: I0221 22:01:45.005839 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ncbrn"]
Feb 21 22:01:45 crc kubenswrapper[4717]: I0221 22:01:45.267260 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-75qq2"]
Feb 21 22:01:45 crc kubenswrapper[4717]: I0221 22:01:45.716006 4717 generic.go:334] "Generic (PLEG): container finished" podID="be146b59-8dd7-4274-aa00-9d9872fddf6e" containerID="73faedea9a78734027ca32d988a5bd77149a3417bc03ba797c60133876399194" exitCode=0
Feb 21 22:01:45 crc kubenswrapper[4717]: I0221 22:01:45.716078 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-75qq2" event={"ID":"be146b59-8dd7-4274-aa00-9d9872fddf6e","Type":"ContainerDied","Data":"73faedea9a78734027ca32d988a5bd77149a3417bc03ba797c60133876399194"}
Feb 21 22:01:45 crc kubenswrapper[4717]: I0221 22:01:45.716105 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-75qq2" event={"ID":"be146b59-8dd7-4274-aa00-9d9872fddf6e","Type":"ContainerStarted","Data":"06c4bba9b48c6cf9a7a40939e7d09f87af2963976d103f0bbf65ad5f5a1befc0"}
Feb 21 22:01:45 crc kubenswrapper[4717]: I0221 22:01:45.720632 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5zc8b" event={"ID":"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8","Type":"ContainerStarted","Data":"8da67497e05dfe26e54c5054fb90be037db085c2113602bb1f1601030f98c4c5"}
Feb 21 22:01:45 crc kubenswrapper[4717]: I0221 22:01:45.724331 4717 generic.go:334] "Generic (PLEG): container finished" podID="6bed5ac2-5539-4722-9432-1ee6618783bc" containerID="28b10e04c83c13b9bc1b243752bedad3335cc8f976511743d3d260e52194bf22" exitCode=0
Feb 21 22:01:45 crc kubenswrapper[4717]: I0221 22:01:45.725748 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncbrn" event={"ID":"6bed5ac2-5539-4722-9432-1ee6618783bc","Type":"ContainerDied","Data":"28b10e04c83c13b9bc1b243752bedad3335cc8f976511743d3d260e52194bf22"}
Feb 21 22:01:45 crc kubenswrapper[4717]: I0221 22:01:45.725782 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncbrn" event={"ID":"6bed5ac2-5539-4722-9432-1ee6618783bc","Type":"ContainerStarted","Data":"d3dc1b2459edc31ec4473237925f910ca6be275326ce32b7fccb45735da21c64"}
Feb 21 22:01:45 crc kubenswrapper[4717]: I0221 22:01:45.791027 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-5zc8b" podStartSLOduration=2.157191577 podStartE2EDuration="5.791003958s" podCreationTimestamp="2026-02-21 22:01:40 +0000 UTC" firstStartedPulling="2026-02-21 22:01:40.911140977 +0000 UTC m=+915.692674599" lastFinishedPulling="2026-02-21 22:01:44.544953358 +0000 UTC m=+919.326486980" observedRunningTime="2026-02-21 22:01:45.759952724 +0000 UTC m=+920.541486346" watchObservedRunningTime="2026-02-21 22:01:45.791003958 +0000 UTC m=+920.572537580"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.120807 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-wcf2g"]
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.130548 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wcf2g"]
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.130638 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wcf2g"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.194282 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-75qq2"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.207941 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gskvg\" (UniqueName: \"kubernetes.io/projected/4b97d305-a098-430c-9afd-36a981d3f978-kube-api-access-gskvg\") pod \"keystone-db-create-wcf2g\" (UID: \"4b97d305-a098-430c-9afd-36a981d3f978\") " pod="openstack/keystone-db-create-wcf2g"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.207987 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b97d305-a098-430c-9afd-36a981d3f978-operator-scripts\") pod \"keystone-db-create-wcf2g\" (UID: \"4b97d305-a098-430c-9afd-36a981d3f978\") " pod="openstack/keystone-db-create-wcf2g"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.226733 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0e51-account-create-update-xtqdq"]
Feb 21 22:01:47 crc kubenswrapper[4717]: E0221 22:01:47.227086 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be146b59-8dd7-4274-aa00-9d9872fddf6e" containerName="mariadb-account-create-update"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.227100 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="be146b59-8dd7-4274-aa00-9d9872fddf6e" containerName="mariadb-account-create-update"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.227352 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="be146b59-8dd7-4274-aa00-9d9872fddf6e" containerName="mariadb-account-create-update"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.227825 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0e51-account-create-update-xtqdq"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.229832 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.242796 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0e51-account-create-update-xtqdq"]
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.308580 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be146b59-8dd7-4274-aa00-9d9872fddf6e-operator-scripts\") pod \"be146b59-8dd7-4274-aa00-9d9872fddf6e\" (UID: \"be146b59-8dd7-4274-aa00-9d9872fddf6e\") "
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.309135 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be146b59-8dd7-4274-aa00-9d9872fddf6e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be146b59-8dd7-4274-aa00-9d9872fddf6e" (UID: "be146b59-8dd7-4274-aa00-9d9872fddf6e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.309410 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfvtj\" (UniqueName: \"kubernetes.io/projected/be146b59-8dd7-4274-aa00-9d9872fddf6e-kube-api-access-wfvtj\") pod \"be146b59-8dd7-4274-aa00-9d9872fddf6e\" (UID: \"be146b59-8dd7-4274-aa00-9d9872fddf6e\") "
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.310392 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt2dj\" (UniqueName: \"kubernetes.io/projected/517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7-kube-api-access-vt2dj\") pod \"keystone-0e51-account-create-update-xtqdq\" (UID: \"517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7\") " pod="openstack/keystone-0e51-account-create-update-xtqdq"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.311409 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7-operator-scripts\") pod \"keystone-0e51-account-create-update-xtqdq\" (UID: \"517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7\") " pod="openstack/keystone-0e51-account-create-update-xtqdq"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.311508 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gskvg\" (UniqueName: \"kubernetes.io/projected/4b97d305-a098-430c-9afd-36a981d3f978-kube-api-access-gskvg\") pod \"keystone-db-create-wcf2g\" (UID: \"4b97d305-a098-430c-9afd-36a981d3f978\") " pod="openstack/keystone-db-create-wcf2g"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.312052 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b97d305-a098-430c-9afd-36a981d3f978-operator-scripts\") pod \"keystone-db-create-wcf2g\" (UID: \"4b97d305-a098-430c-9afd-36a981d3f978\") " pod="openstack/keystone-db-create-wcf2g"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.312784 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b97d305-a098-430c-9afd-36a981d3f978-operator-scripts\") pod \"keystone-db-create-wcf2g\" (UID: \"4b97d305-a098-430c-9afd-36a981d3f978\") " pod="openstack/keystone-db-create-wcf2g"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.312963 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be146b59-8dd7-4274-aa00-9d9872fddf6e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.320784 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be146b59-8dd7-4274-aa00-9d9872fddf6e-kube-api-access-wfvtj" (OuterVolumeSpecName: "kube-api-access-wfvtj") pod "be146b59-8dd7-4274-aa00-9d9872fddf6e" (UID: "be146b59-8dd7-4274-aa00-9d9872fddf6e"). InnerVolumeSpecName "kube-api-access-wfvtj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.326370 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gskvg\" (UniqueName: \"kubernetes.io/projected/4b97d305-a098-430c-9afd-36a981d3f978-kube-api-access-gskvg\") pod \"keystone-db-create-wcf2g\" (UID: \"4b97d305-a098-430c-9afd-36a981d3f978\") " pod="openstack/keystone-db-create-wcf2g"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.414768 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.414970 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt2dj\" (UniqueName: \"kubernetes.io/projected/517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7-kube-api-access-vt2dj\") pod \"keystone-0e51-account-create-update-xtqdq\" (UID: \"517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7\") " pod="openstack/keystone-0e51-account-create-update-xtqdq"
Feb 21 22:01:47 crc kubenswrapper[4717]: E0221 22:01:47.415016 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 21 22:01:47 crc kubenswrapper[4717]: E0221 22:01:47.415051 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 21 22:01:47 crc kubenswrapper[4717]: E0221 22:01:47.415123 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift podName:1fc309c6-44f4-4daf-90fa-6bf6845f195d nodeName:}" failed. No retries permitted until 2026-02-21 22:01:55.415098668 +0000 UTC m=+930.196632300 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift") pod "swift-storage-0" (UID: "1fc309c6-44f4-4daf-90fa-6bf6845f195d") : configmap "swift-ring-files" not found
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.415558 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7-operator-scripts\") pod \"keystone-0e51-account-create-update-xtqdq\" (UID: \"517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7\") " pod="openstack/keystone-0e51-account-create-update-xtqdq"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.415662 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfvtj\" (UniqueName: \"kubernetes.io/projected/be146b59-8dd7-4274-aa00-9d9872fddf6e-kube-api-access-wfvtj\") on node \"crc\" DevicePath \"\""
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.416452 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7-operator-scripts\") pod \"keystone-0e51-account-create-update-xtqdq\" (UID: \"517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7\") " pod="openstack/keystone-0e51-account-create-update-xtqdq"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.427469 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rms66"]
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.428594 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rms66"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.442727 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rms66"]
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.447629 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt2dj\" (UniqueName: \"kubernetes.io/projected/517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7-kube-api-access-vt2dj\") pod \"keystone-0e51-account-create-update-xtqdq\" (UID: \"517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7\") " pod="openstack/keystone-0e51-account-create-update-xtqdq"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.507709 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wcf2g"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.516770 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpgrx\" (UniqueName: \"kubernetes.io/projected/362d581c-08cc-41a5-9327-209994947262-kube-api-access-bpgrx\") pod \"placement-db-create-rms66\" (UID: \"362d581c-08cc-41a5-9327-209994947262\") " pod="openstack/placement-db-create-rms66"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.516965 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362d581c-08cc-41a5-9327-209994947262-operator-scripts\") pod \"placement-db-create-rms66\" (UID: \"362d581c-08cc-41a5-9327-209994947262\") " pod="openstack/placement-db-create-rms66"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.527306 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-fc8f-account-create-update-vfrcv"]
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.528294 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fc8f-account-create-update-vfrcv"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.530379 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.541001 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fc8f-account-create-update-vfrcv"]
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.548498 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0e51-account-create-update-xtqdq"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.624779 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/045ab09f-8170-4815-8af2-7511d0a29f3e-operator-scripts\") pod \"placement-fc8f-account-create-update-vfrcv\" (UID: \"045ab09f-8170-4815-8af2-7511d0a29f3e\") " pod="openstack/placement-fc8f-account-create-update-vfrcv"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.625303 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpgrx\" (UniqueName: \"kubernetes.io/projected/362d581c-08cc-41a5-9327-209994947262-kube-api-access-bpgrx\") pod \"placement-db-create-rms66\" (UID: \"362d581c-08cc-41a5-9327-209994947262\") " pod="openstack/placement-db-create-rms66"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.625474 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362d581c-08cc-41a5-9327-209994947262-operator-scripts\") pod \"placement-db-create-rms66\" (UID: \"362d581c-08cc-41a5-9327-209994947262\") " pod="openstack/placement-db-create-rms66"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.625569 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb2sj\" (UniqueName: \"kubernetes.io/projected/045ab09f-8170-4815-8af2-7511d0a29f3e-kube-api-access-cb2sj\") pod \"placement-fc8f-account-create-update-vfrcv\" (UID: \"045ab09f-8170-4815-8af2-7511d0a29f3e\") " pod="openstack/placement-fc8f-account-create-update-vfrcv"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.627162 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362d581c-08cc-41a5-9327-209994947262-operator-scripts\") pod \"placement-db-create-rms66\" (UID: \"362d581c-08cc-41a5-9327-209994947262\") " pod="openstack/placement-db-create-rms66"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.650292 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpgrx\" (UniqueName: \"kubernetes.io/projected/362d581c-08cc-41a5-9327-209994947262-kube-api-access-bpgrx\") pod \"placement-db-create-rms66\" (UID: \"362d581c-08cc-41a5-9327-209994947262\") " pod="openstack/placement-db-create-rms66"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.735817 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/045ab09f-8170-4815-8af2-7511d0a29f3e-operator-scripts\") pod \"placement-fc8f-account-create-update-vfrcv\" (UID: \"045ab09f-8170-4815-8af2-7511d0a29f3e\") " pod="openstack/placement-fc8f-account-create-update-vfrcv"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.735952 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb2sj\" (UniqueName: \"kubernetes.io/projected/045ab09f-8170-4815-8af2-7511d0a29f3e-kube-api-access-cb2sj\") pod \"placement-fc8f-account-create-update-vfrcv\" (UID: \"045ab09f-8170-4815-8af2-7511d0a29f3e\") " pod="openstack/placement-fc8f-account-create-update-vfrcv"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.736440 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/045ab09f-8170-4815-8af2-7511d0a29f3e-operator-scripts\") pod \"placement-fc8f-account-create-update-vfrcv\" (UID: \"045ab09f-8170-4815-8af2-7511d0a29f3e\") " pod="openstack/placement-fc8f-account-create-update-vfrcv"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.771175 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb2sj\" (UniqueName: \"kubernetes.io/projected/045ab09f-8170-4815-8af2-7511d0a29f3e-kube-api-access-cb2sj\") pod \"placement-fc8f-account-create-update-vfrcv\" (UID: \"045ab09f-8170-4815-8af2-7511d0a29f3e\") " pod="openstack/placement-fc8f-account-create-update-vfrcv"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.780202 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-75qq2" event={"ID":"be146b59-8dd7-4274-aa00-9d9872fddf6e","Type":"ContainerDied","Data":"06c4bba9b48c6cf9a7a40939e7d09f87af2963976d103f0bbf65ad5f5a1befc0"}
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.780248 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06c4bba9b48c6cf9a7a40939e7d09f87af2963976d103f0bbf65ad5f5a1befc0"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.780327 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-75qq2"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.782526 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rms66"
Feb 21 22:01:47 crc kubenswrapper[4717]: I0221 22:01:47.843818 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fc8f-account-create-update-vfrcv"
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.010440 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wcf2g"]
Feb 21 22:01:48 crc kubenswrapper[4717]: W0221 22:01:48.014105 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b97d305_a098_430c_9afd_36a981d3f978.slice/crio-08990f0beca5b5f1d3ee7cbde14dc4e53035f182a5374457fef22ee5b45fb730 WatchSource:0}: Error finding container 08990f0beca5b5f1d3ee7cbde14dc4e53035f182a5374457fef22ee5b45fb730: Status 404 returned error can't find the container with id 08990f0beca5b5f1d3ee7cbde14dc4e53035f182a5374457fef22ee5b45fb730
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.128019 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0e51-account-create-update-xtqdq"]
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.224274 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rms66"]
Feb 21 22:01:48 crc kubenswrapper[4717]: W0221 22:01:48.246067 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod362d581c_08cc_41a5_9327_209994947262.slice/crio-3558bb7c6aa5d65b018e142d3e111e044301fedb4dd5a3d5619e980daf543e33 WatchSource:0}: Error finding container 3558bb7c6aa5d65b018e142d3e111e044301fedb4dd5a3d5619e980daf543e33: Status 404 returned error can't find the container with id 3558bb7c6aa5d65b018e142d3e111e044301fedb4dd5a3d5619e980daf543e33
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.382668 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fc8f-account-create-update-vfrcv"]
Feb 21 22:01:48 crc kubenswrapper[4717]: W0221 22:01:48.388563 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod045ab09f_8170_4815_8af2_7511d0a29f3e.slice/crio-92dec08b5ada884ec7996ae4bfcb351d25f695d7814174b1318e40f46d6c9faa WatchSource:0}: Error finding container 92dec08b5ada884ec7996ae4bfcb351d25f695d7814174b1318e40f46d6c9faa: Status 404 returned error can't find the container with id 92dec08b5ada884ec7996ae4bfcb351d25f695d7814174b1318e40f46d6c9faa
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.514010 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.618085 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-v7s8s"
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.676684 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s78gk"]
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.676944 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" podUID="192e0436-8d26-476c-b6fc-d2d43f26dc1a" containerName="dnsmasq-dns" containerID="cri-o://b594b51b788b480bcf13945f6306f2bb70e293a75705c75946b784eb22716894" gracePeriod=10
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.791068 4717 generic.go:334] "Generic (PLEG): container finished" podID="6bed5ac2-5539-4722-9432-1ee6618783bc" containerID="f731597efa673138b5ba33082c0fa6f1b56f39d5c6906756a40fbc1cf9ff1c54" exitCode=0
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.791160 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncbrn" event={"ID":"6bed5ac2-5539-4722-9432-1ee6618783bc","Type":"ContainerDied","Data":"f731597efa673138b5ba33082c0fa6f1b56f39d5c6906756a40fbc1cf9ff1c54"}
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.792565 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0e51-account-create-update-xtqdq" event={"ID":"517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7","Type":"ContainerStarted","Data":"417b05e1b9da0cce1fb21ba9078fac3a79e4d41ca454e02a59213ad77b3cb3c1"}
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.792599 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0e51-account-create-update-xtqdq" event={"ID":"517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7","Type":"ContainerStarted","Data":"e93f2a20955d4758969d04abc39c52dad121c8a732dae9e33111229bc03881ea"}
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.794802 4717 generic.go:334] "Generic (PLEG): container finished" podID="4b97d305-a098-430c-9afd-36a981d3f978" containerID="03287299a3fb78fd55b94a2243cf8487ef5a67bb62539551a3c8485918e6ca3e" exitCode=0
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.794938 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wcf2g" event={"ID":"4b97d305-a098-430c-9afd-36a981d3f978","Type":"ContainerDied","Data":"03287299a3fb78fd55b94a2243cf8487ef5a67bb62539551a3c8485918e6ca3e"}
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.794961 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wcf2g" event={"ID":"4b97d305-a098-430c-9afd-36a981d3f978","Type":"ContainerStarted","Data":"08990f0beca5b5f1d3ee7cbde14dc4e53035f182a5374457fef22ee5b45fb730"}
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.801836 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc8f-account-create-update-vfrcv" event={"ID":"045ab09f-8170-4815-8af2-7511d0a29f3e","Type":"ContainerStarted","Data":"b01bbd660892dea72dca718004e62bd76fcfb0330af74253832932721da9a9ea"}
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.802171 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc8f-account-create-update-vfrcv" event={"ID":"045ab09f-8170-4815-8af2-7511d0a29f3e","Type":"ContainerStarted","Data":"92dec08b5ada884ec7996ae4bfcb351d25f695d7814174b1318e40f46d6c9faa"}
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.818628 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rms66" event={"ID":"362d581c-08cc-41a5-9327-209994947262","Type":"ContainerStarted","Data":"6a7d6872e80a0e2c9b9165696c47af427bc8a25cec42384f6ae03706d9ae6e32"}
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.818671 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rms66" event={"ID":"362d581c-08cc-41a5-9327-209994947262","Type":"ContainerStarted","Data":"3558bb7c6aa5d65b018e142d3e111e044301fedb4dd5a3d5619e980daf543e33"}
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.850324 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-0e51-account-create-update-xtqdq" podStartSLOduration=1.850308981 podStartE2EDuration="1.850308981s" podCreationTimestamp="2026-02-21 22:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:01:48.840324202 +0000 UTC m=+923.621857824" watchObservedRunningTime="2026-02-21 22:01:48.850308981 +0000 UTC m=+923.631842603"
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221 22:01:48.867574 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-fc8f-account-create-update-vfrcv" podStartSLOduration=1.867551514 podStartE2EDuration="1.867551514s" podCreationTimestamp="2026-02-21 22:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:01:48.864804688 +0000 UTC m=+923.646338320" watchObservedRunningTime="2026-02-21 22:01:48.867551514 +0000 UTC m=+923.649085136"
Feb 21 22:01:48 crc kubenswrapper[4717]: I0221
22:01:48.892371 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-rms66" podStartSLOduration=1.8923503080000001 podStartE2EDuration="1.892350308s" podCreationTimestamp="2026-02-21 22:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:01:48.891096338 +0000 UTC m=+923.672629960" watchObservedRunningTime="2026-02-21 22:01:48.892350308 +0000 UTC m=+923.673883930" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.037676 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nqq7s"] Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.040081 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.050237 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqq7s"] Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.092921 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.108985 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-ovsdbserver-nb\") pod \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.109042 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfhm9\" (UniqueName: \"kubernetes.io/projected/192e0436-8d26-476c-b6fc-d2d43f26dc1a-kube-api-access-nfhm9\") pod \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.109107 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-config\") pod \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.109167 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-dns-svc\") pod \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.109233 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-ovsdbserver-sb\") pod \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\" (UID: \"192e0436-8d26-476c-b6fc-d2d43f26dc1a\") " Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.109421 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8fjfg\" (UniqueName: \"kubernetes.io/projected/db87482a-6aa4-49f8-ac16-b2aa288196ff-kube-api-access-8fjfg\") pod \"redhat-marketplace-nqq7s\" (UID: \"db87482a-6aa4-49f8-ac16-b2aa288196ff\") " pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.109449 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db87482a-6aa4-49f8-ac16-b2aa288196ff-catalog-content\") pod \"redhat-marketplace-nqq7s\" (UID: \"db87482a-6aa4-49f8-ac16-b2aa288196ff\") " pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.109543 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db87482a-6aa4-49f8-ac16-b2aa288196ff-utilities\") pod \"redhat-marketplace-nqq7s\" (UID: \"db87482a-6aa4-49f8-ac16-b2aa288196ff\") " pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.118127 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/192e0436-8d26-476c-b6fc-d2d43f26dc1a-kube-api-access-nfhm9" (OuterVolumeSpecName: "kube-api-access-nfhm9") pod "192e0436-8d26-476c-b6fc-d2d43f26dc1a" (UID: "192e0436-8d26-476c-b6fc-d2d43f26dc1a"). InnerVolumeSpecName "kube-api-access-nfhm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.151705 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "192e0436-8d26-476c-b6fc-d2d43f26dc1a" (UID: "192e0436-8d26-476c-b6fc-d2d43f26dc1a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.178538 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "192e0436-8d26-476c-b6fc-d2d43f26dc1a" (UID: "192e0436-8d26-476c-b6fc-d2d43f26dc1a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.196147 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-config" (OuterVolumeSpecName: "config") pod "192e0436-8d26-476c-b6fc-d2d43f26dc1a" (UID: "192e0436-8d26-476c-b6fc-d2d43f26dc1a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.201518 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "192e0436-8d26-476c-b6fc-d2d43f26dc1a" (UID: "192e0436-8d26-476c-b6fc-d2d43f26dc1a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.210895 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fjfg\" (UniqueName: \"kubernetes.io/projected/db87482a-6aa4-49f8-ac16-b2aa288196ff-kube-api-access-8fjfg\") pod \"redhat-marketplace-nqq7s\" (UID: \"db87482a-6aa4-49f8-ac16-b2aa288196ff\") " pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.210961 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db87482a-6aa4-49f8-ac16-b2aa288196ff-catalog-content\") pod \"redhat-marketplace-nqq7s\" (UID: \"db87482a-6aa4-49f8-ac16-b2aa288196ff\") " pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.211061 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db87482a-6aa4-49f8-ac16-b2aa288196ff-utilities\") pod \"redhat-marketplace-nqq7s\" (UID: \"db87482a-6aa4-49f8-ac16-b2aa288196ff\") " pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.211131 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfhm9\" (UniqueName: \"kubernetes.io/projected/192e0436-8d26-476c-b6fc-d2d43f26dc1a-kube-api-access-nfhm9\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.211142 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.211152 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-dns-svc\") on 
node \"crc\" DevicePath \"\"" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.211160 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.211168 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/192e0436-8d26-476c-b6fc-d2d43f26dc1a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.211614 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db87482a-6aa4-49f8-ac16-b2aa288196ff-utilities\") pod \"redhat-marketplace-nqq7s\" (UID: \"db87482a-6aa4-49f8-ac16-b2aa288196ff\") " pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.212260 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db87482a-6aa4-49f8-ac16-b2aa288196ff-catalog-content\") pod \"redhat-marketplace-nqq7s\" (UID: \"db87482a-6aa4-49f8-ac16-b2aa288196ff\") " pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.237450 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fjfg\" (UniqueName: \"kubernetes.io/projected/db87482a-6aa4-49f8-ac16-b2aa288196ff-kube-api-access-8fjfg\") pod \"redhat-marketplace-nqq7s\" (UID: \"db87482a-6aa4-49f8-ac16-b2aa288196ff\") " pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.392790 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.827255 4717 generic.go:334] "Generic (PLEG): container finished" podID="517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7" containerID="417b05e1b9da0cce1fb21ba9078fac3a79e4d41ca454e02a59213ad77b3cb3c1" exitCode=0 Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.827413 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0e51-account-create-update-xtqdq" event={"ID":"517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7","Type":"ContainerDied","Data":"417b05e1b9da0cce1fb21ba9078fac3a79e4d41ca454e02a59213ad77b3cb3c1"} Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.828783 4717 generic.go:334] "Generic (PLEG): container finished" podID="045ab09f-8170-4815-8af2-7511d0a29f3e" containerID="b01bbd660892dea72dca718004e62bd76fcfb0330af74253832932721da9a9ea" exitCode=0 Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.828846 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc8f-account-create-update-vfrcv" event={"ID":"045ab09f-8170-4815-8af2-7511d0a29f3e","Type":"ContainerDied","Data":"b01bbd660892dea72dca718004e62bd76fcfb0330af74253832932721da9a9ea"} Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.830135 4717 generic.go:334] "Generic (PLEG): container finished" podID="362d581c-08cc-41a5-9327-209994947262" containerID="6a7d6872e80a0e2c9b9165696c47af427bc8a25cec42384f6ae03706d9ae6e32" exitCode=0 Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.830192 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rms66" event={"ID":"362d581c-08cc-41a5-9327-209994947262","Type":"ContainerDied","Data":"6a7d6872e80a0e2c9b9165696c47af427bc8a25cec42384f6ae03706d9ae6e32"} Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.831523 4717 generic.go:334] "Generic (PLEG): container finished" podID="192e0436-8d26-476c-b6fc-d2d43f26dc1a" 
containerID="b594b51b788b480bcf13945f6306f2bb70e293a75705c75946b784eb22716894" exitCode=0 Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.831565 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" event={"ID":"192e0436-8d26-476c-b6fc-d2d43f26dc1a","Type":"ContainerDied","Data":"b594b51b788b480bcf13945f6306f2bb70e293a75705c75946b784eb22716894"} Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.831581 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" event={"ID":"192e0436-8d26-476c-b6fc-d2d43f26dc1a","Type":"ContainerDied","Data":"cce40c0b6d820fb759bd591d21498f4d33f0125add8b740f25555a0d58618d7f"} Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.831597 4717 scope.go:117] "RemoveContainer" containerID="b594b51b788b480bcf13945f6306f2bb70e293a75705c75946b784eb22716894" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.831697 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-s78gk" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.837946 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncbrn" event={"ID":"6bed5ac2-5539-4722-9432-1ee6618783bc","Type":"ContainerStarted","Data":"590f6884cf91ebac7fbb27743746c694c7ab71f20fd19c685b6291c8ca613dd1"} Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.905673 4717 scope.go:117] "RemoveContainer" containerID="2f7e07700bb3ec5941cda3209ad8b1900c81207289286c996b70aed4e3cbff7e" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.911233 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqq7s"] Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.932316 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s78gk"] Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.938467 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-s78gk"] Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.945089 4717 scope.go:117] "RemoveContainer" containerID="b594b51b788b480bcf13945f6306f2bb70e293a75705c75946b784eb22716894" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.946216 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ncbrn" podStartSLOduration=4.45126861 podStartE2EDuration="7.946199742s" podCreationTimestamp="2026-02-21 22:01:42 +0000 UTC" firstStartedPulling="2026-02-21 22:01:45.731651425 +0000 UTC m=+920.513185047" lastFinishedPulling="2026-02-21 22:01:49.226582557 +0000 UTC m=+924.008116179" observedRunningTime="2026-02-21 22:01:49.927382361 +0000 UTC m=+924.708916003" watchObservedRunningTime="2026-02-21 22:01:49.946199742 +0000 UTC m=+924.727733364" Feb 21 22:01:49 crc kubenswrapper[4717]: E0221 22:01:49.948284 4717 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b594b51b788b480bcf13945f6306f2bb70e293a75705c75946b784eb22716894\": container with ID starting with b594b51b788b480bcf13945f6306f2bb70e293a75705c75946b784eb22716894 not found: ID does not exist" containerID="b594b51b788b480bcf13945f6306f2bb70e293a75705c75946b784eb22716894" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.948318 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b594b51b788b480bcf13945f6306f2bb70e293a75705c75946b784eb22716894"} err="failed to get container status \"b594b51b788b480bcf13945f6306f2bb70e293a75705c75946b784eb22716894\": rpc error: code = NotFound desc = could not find container \"b594b51b788b480bcf13945f6306f2bb70e293a75705c75946b784eb22716894\": container with ID starting with b594b51b788b480bcf13945f6306f2bb70e293a75705c75946b784eb22716894 not found: ID does not exist" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.948343 4717 scope.go:117] "RemoveContainer" containerID="2f7e07700bb3ec5941cda3209ad8b1900c81207289286c996b70aed4e3cbff7e" Feb 21 22:01:49 crc kubenswrapper[4717]: E0221 22:01:49.950110 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7e07700bb3ec5941cda3209ad8b1900c81207289286c996b70aed4e3cbff7e\": container with ID starting with 2f7e07700bb3ec5941cda3209ad8b1900c81207289286c996b70aed4e3cbff7e not found: ID does not exist" containerID="2f7e07700bb3ec5941cda3209ad8b1900c81207289286c996b70aed4e3cbff7e" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.950152 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7e07700bb3ec5941cda3209ad8b1900c81207289286c996b70aed4e3cbff7e"} err="failed to get container status \"2f7e07700bb3ec5941cda3209ad8b1900c81207289286c996b70aed4e3cbff7e\": rpc error: code = NotFound desc = could not find container 
\"2f7e07700bb3ec5941cda3209ad8b1900c81207289286c996b70aed4e3cbff7e\": container with ID starting with 2f7e07700bb3ec5941cda3209ad8b1900c81207289286c996b70aed4e3cbff7e not found: ID does not exist" Feb 21 22:01:49 crc kubenswrapper[4717]: I0221 22:01:49.994064 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="192e0436-8d26-476c-b6fc-d2d43f26dc1a" path="/var/lib/kubelet/pods/192e0436-8d26-476c-b6fc-d2d43f26dc1a/volumes" Feb 21 22:01:50 crc kubenswrapper[4717]: I0221 22:01:50.295215 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wcf2g" Feb 21 22:01:50 crc kubenswrapper[4717]: I0221 22:01:50.338163 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gskvg\" (UniqueName: \"kubernetes.io/projected/4b97d305-a098-430c-9afd-36a981d3f978-kube-api-access-gskvg\") pod \"4b97d305-a098-430c-9afd-36a981d3f978\" (UID: \"4b97d305-a098-430c-9afd-36a981d3f978\") " Feb 21 22:01:50 crc kubenswrapper[4717]: I0221 22:01:50.338343 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b97d305-a098-430c-9afd-36a981d3f978-operator-scripts\") pod \"4b97d305-a098-430c-9afd-36a981d3f978\" (UID: \"4b97d305-a098-430c-9afd-36a981d3f978\") " Feb 21 22:01:50 crc kubenswrapper[4717]: I0221 22:01:50.339089 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b97d305-a098-430c-9afd-36a981d3f978-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4b97d305-a098-430c-9afd-36a981d3f978" (UID: "4b97d305-a098-430c-9afd-36a981d3f978"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:50 crc kubenswrapper[4717]: I0221 22:01:50.365533 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b97d305-a098-430c-9afd-36a981d3f978-kube-api-access-gskvg" (OuterVolumeSpecName: "kube-api-access-gskvg") pod "4b97d305-a098-430c-9afd-36a981d3f978" (UID: "4b97d305-a098-430c-9afd-36a981d3f978"). InnerVolumeSpecName "kube-api-access-gskvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:01:50 crc kubenswrapper[4717]: I0221 22:01:50.440283 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gskvg\" (UniqueName: \"kubernetes.io/projected/4b97d305-a098-430c-9afd-36a981d3f978-kube-api-access-gskvg\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:50 crc kubenswrapper[4717]: I0221 22:01:50.440318 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b97d305-a098-430c-9afd-36a981d3f978-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:50 crc kubenswrapper[4717]: I0221 22:01:50.848218 4717 generic.go:334] "Generic (PLEG): container finished" podID="db87482a-6aa4-49f8-ac16-b2aa288196ff" containerID="bd660dac1eafa5f13673af9c9a7cc1f785e4b4ee09500acb579efe9a4221db46" exitCode=0 Feb 21 22:01:50 crc kubenswrapper[4717]: I0221 22:01:50.849239 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqq7s" event={"ID":"db87482a-6aa4-49f8-ac16-b2aa288196ff","Type":"ContainerDied","Data":"bd660dac1eafa5f13673af9c9a7cc1f785e4b4ee09500acb579efe9a4221db46"} Feb 21 22:01:50 crc kubenswrapper[4717]: I0221 22:01:50.849286 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqq7s" event={"ID":"db87482a-6aa4-49f8-ac16-b2aa288196ff","Type":"ContainerStarted","Data":"a6a14e016e7107e99c670a5f051485d8434b9ff07d11cdd99bb602552408de43"} Feb 21 22:01:50 crc 
kubenswrapper[4717]: I0221 22:01:50.851437 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wcf2g" event={"ID":"4b97d305-a098-430c-9afd-36a981d3f978","Type":"ContainerDied","Data":"08990f0beca5b5f1d3ee7cbde14dc4e53035f182a5374457fef22ee5b45fb730"} Feb 21 22:01:50 crc kubenswrapper[4717]: I0221 22:01:50.851462 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08990f0beca5b5f1d3ee7cbde14dc4e53035f182a5374457fef22ee5b45fb730" Feb 21 22:01:50 crc kubenswrapper[4717]: I0221 22:01:50.851537 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wcf2g" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.251647 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-5lppt"] Feb 21 22:01:51 crc kubenswrapper[4717]: E0221 22:01:51.251923 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b97d305-a098-430c-9afd-36a981d3f978" containerName="mariadb-database-create" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.251935 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b97d305-a098-430c-9afd-36a981d3f978" containerName="mariadb-database-create" Feb 21 22:01:51 crc kubenswrapper[4717]: E0221 22:01:51.251957 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192e0436-8d26-476c-b6fc-d2d43f26dc1a" containerName="init" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.251963 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="192e0436-8d26-476c-b6fc-d2d43f26dc1a" containerName="init" Feb 21 22:01:51 crc kubenswrapper[4717]: E0221 22:01:51.251974 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192e0436-8d26-476c-b6fc-d2d43f26dc1a" containerName="dnsmasq-dns" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.251981 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="192e0436-8d26-476c-b6fc-d2d43f26dc1a" 
containerName="dnsmasq-dns" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.252139 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b97d305-a098-430c-9afd-36a981d3f978" containerName="mariadb-database-create" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.252155 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="192e0436-8d26-476c-b6fc-d2d43f26dc1a" containerName="dnsmasq-dns" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.252583 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5lppt" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.259615 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5lppt"] Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.367796 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c8b56e1-15ef-43f4-bc49-3dfe18978736-operator-scripts\") pod \"glance-db-create-5lppt\" (UID: \"6c8b56e1-15ef-43f4-bc49-3dfe18978736\") " pod="openstack/glance-db-create-5lppt" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.368305 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t282\" (UniqueName: \"kubernetes.io/projected/6c8b56e1-15ef-43f4-bc49-3dfe18978736-kube-api-access-4t282\") pod \"glance-db-create-5lppt\" (UID: \"6c8b56e1-15ef-43f4-bc49-3dfe18978736\") " pod="openstack/glance-db-create-5lppt" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.468699 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9b8b-account-create-update-gf86p"] Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.469774 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9b8b-account-create-update-gf86p" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.471824 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.473634 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t282\" (UniqueName: \"kubernetes.io/projected/6c8b56e1-15ef-43f4-bc49-3dfe18978736-kube-api-access-4t282\") pod \"glance-db-create-5lppt\" (UID: \"6c8b56e1-15ef-43f4-bc49-3dfe18978736\") " pod="openstack/glance-db-create-5lppt" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.473714 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c8b56e1-15ef-43f4-bc49-3dfe18978736-operator-scripts\") pod \"glance-db-create-5lppt\" (UID: \"6c8b56e1-15ef-43f4-bc49-3dfe18978736\") " pod="openstack/glance-db-create-5lppt" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.480929 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9b8b-account-create-update-gf86p"] Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.526886 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c8b56e1-15ef-43f4-bc49-3dfe18978736-operator-scripts\") pod \"glance-db-create-5lppt\" (UID: \"6c8b56e1-15ef-43f4-bc49-3dfe18978736\") " pod="openstack/glance-db-create-5lppt" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.540874 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t282\" (UniqueName: \"kubernetes.io/projected/6c8b56e1-15ef-43f4-bc49-3dfe18978736-kube-api-access-4t282\") pod \"glance-db-create-5lppt\" (UID: \"6c8b56e1-15ef-43f4-bc49-3dfe18978736\") " pod="openstack/glance-db-create-5lppt" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 
22:01:51.574994 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6fp6\" (UniqueName: \"kubernetes.io/projected/8357efe6-d264-4a56-902a-b7e443b93ac7-kube-api-access-x6fp6\") pod \"glance-9b8b-account-create-update-gf86p\" (UID: \"8357efe6-d264-4a56-902a-b7e443b93ac7\") " pod="openstack/glance-9b8b-account-create-update-gf86p" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.575057 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8357efe6-d264-4a56-902a-b7e443b93ac7-operator-scripts\") pod \"glance-9b8b-account-create-update-gf86p\" (UID: \"8357efe6-d264-4a56-902a-b7e443b93ac7\") " pod="openstack/glance-9b8b-account-create-update-gf86p" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.644463 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0e51-account-create-update-xtqdq" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.658360 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-5lppt" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.677149 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7-operator-scripts\") pod \"517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7\" (UID: \"517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7\") " Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.677246 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt2dj\" (UniqueName: \"kubernetes.io/projected/517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7-kube-api-access-vt2dj\") pod \"517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7\" (UID: \"517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7\") " Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.677467 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6fp6\" (UniqueName: \"kubernetes.io/projected/8357efe6-d264-4a56-902a-b7e443b93ac7-kube-api-access-x6fp6\") pod \"glance-9b8b-account-create-update-gf86p\" (UID: \"8357efe6-d264-4a56-902a-b7e443b93ac7\") " pod="openstack/glance-9b8b-account-create-update-gf86p" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.677512 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8357efe6-d264-4a56-902a-b7e443b93ac7-operator-scripts\") pod \"glance-9b8b-account-create-update-gf86p\" (UID: \"8357efe6-d264-4a56-902a-b7e443b93ac7\") " pod="openstack/glance-9b8b-account-create-update-gf86p" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.677812 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7" (UID: "517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.678321 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8357efe6-d264-4a56-902a-b7e443b93ac7-operator-scripts\") pod \"glance-9b8b-account-create-update-gf86p\" (UID: \"8357efe6-d264-4a56-902a-b7e443b93ac7\") " pod="openstack/glance-9b8b-account-create-update-gf86p" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.683596 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7-kube-api-access-vt2dj" (OuterVolumeSpecName: "kube-api-access-vt2dj") pod "517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7" (UID: "517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7"). InnerVolumeSpecName "kube-api-access-vt2dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.698644 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6fp6\" (UniqueName: \"kubernetes.io/projected/8357efe6-d264-4a56-902a-b7e443b93ac7-kube-api-access-x6fp6\") pod \"glance-9b8b-account-create-update-gf86p\" (UID: \"8357efe6-d264-4a56-902a-b7e443b93ac7\") " pod="openstack/glance-9b8b-account-create-update-gf86p" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.748304 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rms66" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.787103 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt2dj\" (UniqueName: \"kubernetes.io/projected/517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7-kube-api-access-vt2dj\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.787238 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.827669 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9b8b-account-create-update-gf86p" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.870824 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rms66" event={"ID":"362d581c-08cc-41a5-9327-209994947262","Type":"ContainerDied","Data":"3558bb7c6aa5d65b018e142d3e111e044301fedb4dd5a3d5619e980daf543e33"} Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.870902 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3558bb7c6aa5d65b018e142d3e111e044301fedb4dd5a3d5619e980daf543e33" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.871003 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rms66" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.874309 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0e51-account-create-update-xtqdq" event={"ID":"517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7","Type":"ContainerDied","Data":"e93f2a20955d4758969d04abc39c52dad121c8a732dae9e33111229bc03881ea"} Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.874332 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e93f2a20955d4758969d04abc39c52dad121c8a732dae9e33111229bc03881ea" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.874392 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0e51-account-create-update-xtqdq" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.890307 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpgrx\" (UniqueName: \"kubernetes.io/projected/362d581c-08cc-41a5-9327-209994947262-kube-api-access-bpgrx\") pod \"362d581c-08cc-41a5-9327-209994947262\" (UID: \"362d581c-08cc-41a5-9327-209994947262\") " Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.890563 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362d581c-08cc-41a5-9327-209994947262-operator-scripts\") pod \"362d581c-08cc-41a5-9327-209994947262\" (UID: \"362d581c-08cc-41a5-9327-209994947262\") " Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.891234 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/362d581c-08cc-41a5-9327-209994947262-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "362d581c-08cc-41a5-9327-209994947262" (UID: "362d581c-08cc-41a5-9327-209994947262"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.894265 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362d581c-08cc-41a5-9327-209994947262-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.906248 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/362d581c-08cc-41a5-9327-209994947262-kube-api-access-bpgrx" (OuterVolumeSpecName: "kube-api-access-bpgrx") pod "362d581c-08cc-41a5-9327-209994947262" (UID: "362d581c-08cc-41a5-9327-209994947262"). InnerVolumeSpecName "kube-api-access-bpgrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.927031 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fc8f-account-create-update-vfrcv" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.994956 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb2sj\" (UniqueName: \"kubernetes.io/projected/045ab09f-8170-4815-8af2-7511d0a29f3e-kube-api-access-cb2sj\") pod \"045ab09f-8170-4815-8af2-7511d0a29f3e\" (UID: \"045ab09f-8170-4815-8af2-7511d0a29f3e\") " Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.995338 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/045ab09f-8170-4815-8af2-7511d0a29f3e-operator-scripts\") pod \"045ab09f-8170-4815-8af2-7511d0a29f3e\" (UID: \"045ab09f-8170-4815-8af2-7511d0a29f3e\") " Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.995767 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpgrx\" (UniqueName: \"kubernetes.io/projected/362d581c-08cc-41a5-9327-209994947262-kube-api-access-bpgrx\") on node \"crc\" DevicePath 
\"\"" Feb 21 22:01:51 crc kubenswrapper[4717]: I0221 22:01:51.996161 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/045ab09f-8170-4815-8af2-7511d0a29f3e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "045ab09f-8170-4815-8af2-7511d0a29f3e" (UID: "045ab09f-8170-4815-8af2-7511d0a29f3e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.000419 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5lppt"] Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.001665 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/045ab09f-8170-4815-8af2-7511d0a29f3e-kube-api-access-cb2sj" (OuterVolumeSpecName: "kube-api-access-cb2sj") pod "045ab09f-8170-4815-8af2-7511d0a29f3e" (UID: "045ab09f-8170-4815-8af2-7511d0a29f3e"). InnerVolumeSpecName "kube-api-access-cb2sj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:01:52 crc kubenswrapper[4717]: W0221 22:01:52.009428 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c8b56e1_15ef_43f4_bc49_3dfe18978736.slice/crio-40fdd25e79df5f942a16a8fab375aa13ae17e6d745a34d9da8760b4a7ffaafe3 WatchSource:0}: Error finding container 40fdd25e79df5f942a16a8fab375aa13ae17e6d745a34d9da8760b4a7ffaafe3: Status 404 returned error can't find the container with id 40fdd25e79df5f942a16a8fab375aa13ae17e6d745a34d9da8760b4a7ffaafe3 Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.098096 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb2sj\" (UniqueName: \"kubernetes.io/projected/045ab09f-8170-4815-8af2-7511d0a29f3e-kube-api-access-cb2sj\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.098126 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/045ab09f-8170-4815-8af2-7511d0a29f3e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.274400 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9b8b-account-create-update-gf86p"] Feb 21 22:01:52 crc kubenswrapper[4717]: W0221 22:01:52.280636 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8357efe6_d264_4a56_902a_b7e443b93ac7.slice/crio-422489db18d5b34811ee4c048349d2a558b5f851e937c4f6cc47306d87b594d5 WatchSource:0}: Error finding container 422489db18d5b34811ee4c048349d2a558b5f851e937c4f6cc47306d87b594d5: Status 404 returned error can't find the container with id 422489db18d5b34811ee4c048349d2a558b5f851e937c4f6cc47306d87b594d5 Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.523546 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ovn-northd-0" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.661474 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ncbrn" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.661532 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ncbrn" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.885357 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc8f-account-create-update-vfrcv" event={"ID":"045ab09f-8170-4815-8af2-7511d0a29f3e","Type":"ContainerDied","Data":"92dec08b5ada884ec7996ae4bfcb351d25f695d7814174b1318e40f46d6c9faa"} Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.885417 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92dec08b5ada884ec7996ae4bfcb351d25f695d7814174b1318e40f46d6c9faa" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.885372 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fc8f-account-create-update-vfrcv" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.887313 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9b8b-account-create-update-gf86p" event={"ID":"8357efe6-d264-4a56-902a-b7e443b93ac7","Type":"ContainerStarted","Data":"178405d965dfcf5114b5e461d8ca83be0a88edcc7fde6c38357ea3a03e042d43"} Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.887382 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9b8b-account-create-update-gf86p" event={"ID":"8357efe6-d264-4a56-902a-b7e443b93ac7","Type":"ContainerStarted","Data":"422489db18d5b34811ee4c048349d2a558b5f851e937c4f6cc47306d87b594d5"} Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.888917 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5lppt" event={"ID":"6c8b56e1-15ef-43f4-bc49-3dfe18978736","Type":"ContainerStarted","Data":"dd54f39fffd116541fe63ed6f01871dda4f9451975d8377afbd79a1fd91e372c"} Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.888948 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5lppt" event={"ID":"6c8b56e1-15ef-43f4-bc49-3dfe18978736","Type":"ContainerStarted","Data":"40fdd25e79df5f942a16a8fab375aa13ae17e6d745a34d9da8760b4a7ffaafe3"} Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.890462 4717 generic.go:334] "Generic (PLEG): container finished" podID="9d1b5d67-1e8c-4c1f-a6a3-9634827165f8" containerID="8da67497e05dfe26e54c5054fb90be037db085c2113602bb1f1601030f98c4c5" exitCode=0 Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.890511 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5zc8b" event={"ID":"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8","Type":"ContainerDied","Data":"8da67497e05dfe26e54c5054fb90be037db085c2113602bb1f1601030f98c4c5"} Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.907850 4717 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-75qq2"] Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.915247 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-9b8b-account-create-update-gf86p" podStartSLOduration=1.915237082 podStartE2EDuration="1.915237082s" podCreationTimestamp="2026-02-21 22:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:01:52.910754475 +0000 UTC m=+927.692288097" watchObservedRunningTime="2026-02-21 22:01:52.915237082 +0000 UTC m=+927.696770704" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.920090 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-75qq2"] Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.958830 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-5lppt" podStartSLOduration=1.958803746 podStartE2EDuration="1.958803746s" podCreationTimestamp="2026-02-21 22:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:01:52.933389057 +0000 UTC m=+927.714922669" watchObservedRunningTime="2026-02-21 22:01:52.958803746 +0000 UTC m=+927.740337368" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.967176 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2w6q9"] Feb 21 22:01:52 crc kubenswrapper[4717]: E0221 22:01:52.967619 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7" containerName="mariadb-account-create-update" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.967636 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7" containerName="mariadb-account-create-update" 
Feb 21 22:01:52 crc kubenswrapper[4717]: E0221 22:01:52.967664 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="045ab09f-8170-4815-8af2-7511d0a29f3e" containerName="mariadb-account-create-update" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.967673 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="045ab09f-8170-4815-8af2-7511d0a29f3e" containerName="mariadb-account-create-update" Feb 21 22:01:52 crc kubenswrapper[4717]: E0221 22:01:52.967685 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362d581c-08cc-41a5-9327-209994947262" containerName="mariadb-database-create" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.967691 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="362d581c-08cc-41a5-9327-209994947262" containerName="mariadb-database-create" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.967847 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="362d581c-08cc-41a5-9327-209994947262" containerName="mariadb-database-create" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.967876 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="045ab09f-8170-4815-8af2-7511d0a29f3e" containerName="mariadb-account-create-update" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.967889 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7" containerName="mariadb-account-create-update" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.968469 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2w6q9" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.970634 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 21 22:01:52 crc kubenswrapper[4717]: I0221 22:01:52.978803 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2w6q9"] Feb 21 22:01:53 crc kubenswrapper[4717]: I0221 22:01:53.120331 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6vj5\" (UniqueName: \"kubernetes.io/projected/03fd9592-8cf7-4fda-a394-f0ad5efe2397-kube-api-access-p6vj5\") pod \"root-account-create-update-2w6q9\" (UID: \"03fd9592-8cf7-4fda-a394-f0ad5efe2397\") " pod="openstack/root-account-create-update-2w6q9" Feb 21 22:01:53 crc kubenswrapper[4717]: I0221 22:01:53.120757 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03fd9592-8cf7-4fda-a394-f0ad5efe2397-operator-scripts\") pod \"root-account-create-update-2w6q9\" (UID: \"03fd9592-8cf7-4fda-a394-f0ad5efe2397\") " pod="openstack/root-account-create-update-2w6q9" Feb 21 22:01:53 crc kubenswrapper[4717]: I0221 22:01:53.221985 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6vj5\" (UniqueName: \"kubernetes.io/projected/03fd9592-8cf7-4fda-a394-f0ad5efe2397-kube-api-access-p6vj5\") pod \"root-account-create-update-2w6q9\" (UID: \"03fd9592-8cf7-4fda-a394-f0ad5efe2397\") " pod="openstack/root-account-create-update-2w6q9" Feb 21 22:01:53 crc kubenswrapper[4717]: I0221 22:01:53.222084 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03fd9592-8cf7-4fda-a394-f0ad5efe2397-operator-scripts\") pod \"root-account-create-update-2w6q9\" (UID: 
\"03fd9592-8cf7-4fda-a394-f0ad5efe2397\") " pod="openstack/root-account-create-update-2w6q9" Feb 21 22:01:53 crc kubenswrapper[4717]: I0221 22:01:53.222734 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03fd9592-8cf7-4fda-a394-f0ad5efe2397-operator-scripts\") pod \"root-account-create-update-2w6q9\" (UID: \"03fd9592-8cf7-4fda-a394-f0ad5efe2397\") " pod="openstack/root-account-create-update-2w6q9" Feb 21 22:01:53 crc kubenswrapper[4717]: I0221 22:01:53.247543 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6vj5\" (UniqueName: \"kubernetes.io/projected/03fd9592-8cf7-4fda-a394-f0ad5efe2397-kube-api-access-p6vj5\") pod \"root-account-create-update-2w6q9\" (UID: \"03fd9592-8cf7-4fda-a394-f0ad5efe2397\") " pod="openstack/root-account-create-update-2w6q9" Feb 21 22:01:53 crc kubenswrapper[4717]: I0221 22:01:53.285299 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2w6q9" Feb 21 22:01:53 crc kubenswrapper[4717]: I0221 22:01:53.704576 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ncbrn" podUID="6bed5ac2-5539-4722-9432-1ee6618783bc" containerName="registry-server" probeResult="failure" output=< Feb 21 22:01:53 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Feb 21 22:01:53 crc kubenswrapper[4717]: > Feb 21 22:01:53 crc kubenswrapper[4717]: I0221 22:01:53.866900 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2w6q9"] Feb 21 22:01:53 crc kubenswrapper[4717]: W0221 22:01:53.867207 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03fd9592_8cf7_4fda_a394_f0ad5efe2397.slice/crio-2f00f3ed17db5a3be44e3750ee104eee2fb92c39c8822fb4d4d2d62d63aa7d02 WatchSource:0}: Error finding container 2f00f3ed17db5a3be44e3750ee104eee2fb92c39c8822fb4d4d2d62d63aa7d02: Status 404 returned error can't find the container with id 2f00f3ed17db5a3be44e3750ee104eee2fb92c39c8822fb4d4d2d62d63aa7d02 Feb 21 22:01:53 crc kubenswrapper[4717]: I0221 22:01:53.922691 4717 generic.go:334] "Generic (PLEG): container finished" podID="6c8b56e1-15ef-43f4-bc49-3dfe18978736" containerID="dd54f39fffd116541fe63ed6f01871dda4f9451975d8377afbd79a1fd91e372c" exitCode=0 Feb 21 22:01:53 crc kubenswrapper[4717]: I0221 22:01:53.922803 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5lppt" event={"ID":"6c8b56e1-15ef-43f4-bc49-3dfe18978736","Type":"ContainerDied","Data":"dd54f39fffd116541fe63ed6f01871dda4f9451975d8377afbd79a1fd91e372c"} Feb 21 22:01:53 crc kubenswrapper[4717]: I0221 22:01:53.926285 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqq7s" 
event={"ID":"db87482a-6aa4-49f8-ac16-b2aa288196ff","Type":"ContainerStarted","Data":"150c4eaeb07ccfee3fdcf7e736f2b0442837add1f56e04a5701a182c52de38c5"} Feb 21 22:01:53 crc kubenswrapper[4717]: I0221 22:01:53.931176 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2w6q9" event={"ID":"03fd9592-8cf7-4fda-a394-f0ad5efe2397","Type":"ContainerStarted","Data":"2f00f3ed17db5a3be44e3750ee104eee2fb92c39c8822fb4d4d2d62d63aa7d02"} Feb 21 22:01:53 crc kubenswrapper[4717]: I0221 22:01:53.991948 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be146b59-8dd7-4274-aa00-9d9872fddf6e" path="/var/lib/kubelet/pods/be146b59-8dd7-4274-aa00-9d9872fddf6e/volumes" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.286443 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.351579 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-etc-swift\") pod \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.351640 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-combined-ca-bundle\") pod \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.351667 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-ring-data-devices\") pod \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " Feb 21 22:01:54 crc 
kubenswrapper[4717]: I0221 22:01:54.351692 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrqt2\" (UniqueName: \"kubernetes.io/projected/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-kube-api-access-xrqt2\") pod \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.351731 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-swiftconf\") pod \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.351775 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-scripts\") pod \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.353425 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9d1b5d67-1e8c-4c1f-a6a3-9634827165f8" (UID: "9d1b5d67-1e8c-4c1f-a6a3-9634827165f8"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.354602 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9d1b5d67-1e8c-4c1f-a6a3-9634827165f8" (UID: "9d1b5d67-1e8c-4c1f-a6a3-9634827165f8"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.359199 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-kube-api-access-xrqt2" (OuterVolumeSpecName: "kube-api-access-xrqt2") pod "9d1b5d67-1e8c-4c1f-a6a3-9634827165f8" (UID: "9d1b5d67-1e8c-4c1f-a6a3-9634827165f8"). InnerVolumeSpecName "kube-api-access-xrqt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.377672 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d1b5d67-1e8c-4c1f-a6a3-9634827165f8" (UID: "9d1b5d67-1e8c-4c1f-a6a3-9634827165f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.377690 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9d1b5d67-1e8c-4c1f-a6a3-9634827165f8" (UID: "9d1b5d67-1e8c-4c1f-a6a3-9634827165f8"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.381191 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-scripts" (OuterVolumeSpecName: "scripts") pod "9d1b5d67-1e8c-4c1f-a6a3-9634827165f8" (UID: "9d1b5d67-1e8c-4c1f-a6a3-9634827165f8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.453065 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-dispersionconf\") pod \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\" (UID: \"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8\") " Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.453355 4717 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.453371 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.453381 4717 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.453392 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrqt2\" (UniqueName: \"kubernetes.io/projected/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-kube-api-access-xrqt2\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.453400 4717 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.453407 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-scripts\") on node \"crc\" DevicePath 
\"\"" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.458672 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9d1b5d67-1e8c-4c1f-a6a3-9634827165f8" (UID: "9d1b5d67-1e8c-4c1f-a6a3-9634827165f8"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.554727 4717 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d1b5d67-1e8c-4c1f-a6a3-9634827165f8-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.938641 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2w6q9" event={"ID":"03fd9592-8cf7-4fda-a394-f0ad5efe2397","Type":"ContainerStarted","Data":"fc7cd4ee13db69b9236e1f63d473984ae556a720a7e31a844c32a4ad4f852bf8"} Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.941984 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5zc8b" event={"ID":"9d1b5d67-1e8c-4c1f-a6a3-9634827165f8","Type":"ContainerDied","Data":"4e0d27e956eb54c12c1290759b95385954f082780fe54bdea0c766502bdb1c83"} Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.942019 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e0d27e956eb54c12c1290759b95385954f082780fe54bdea0c766502bdb1c83" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.942042 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5zc8b" Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.944235 4717 generic.go:334] "Generic (PLEG): container finished" podID="db87482a-6aa4-49f8-ac16-b2aa288196ff" containerID="150c4eaeb07ccfee3fdcf7e736f2b0442837add1f56e04a5701a182c52de38c5" exitCode=0 Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.944344 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqq7s" event={"ID":"db87482a-6aa4-49f8-ac16-b2aa288196ff","Type":"ContainerDied","Data":"150c4eaeb07ccfee3fdcf7e736f2b0442837add1f56e04a5701a182c52de38c5"} Feb 21 22:01:54 crc kubenswrapper[4717]: I0221 22:01:54.969027 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-2w6q9" podStartSLOduration=2.969005088 podStartE2EDuration="2.969005088s" podCreationTimestamp="2026-02-21 22:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:01:54.964370176 +0000 UTC m=+929.745903808" watchObservedRunningTime="2026-02-21 22:01:54.969005088 +0000 UTC m=+929.750538710" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.331242 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-5lppt" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.434876 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-thlb8"] Feb 21 22:01:55 crc kubenswrapper[4717]: E0221 22:01:55.435213 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c8b56e1-15ef-43f4-bc49-3dfe18978736" containerName="mariadb-database-create" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.435232 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8b56e1-15ef-43f4-bc49-3dfe18978736" containerName="mariadb-database-create" Feb 21 22:01:55 crc kubenswrapper[4717]: E0221 22:01:55.435256 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1b5d67-1e8c-4c1f-a6a3-9634827165f8" containerName="swift-ring-rebalance" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.435262 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1b5d67-1e8c-4c1f-a6a3-9634827165f8" containerName="swift-ring-rebalance" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.435396 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c8b56e1-15ef-43f4-bc49-3dfe18978736" containerName="mariadb-database-create" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.435407 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d1b5d67-1e8c-4c1f-a6a3-9634827165f8" containerName="swift-ring-rebalance" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.436528 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.450815 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-thlb8"] Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.474819 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c8b56e1-15ef-43f4-bc49-3dfe18978736-operator-scripts\") pod \"6c8b56e1-15ef-43f4-bc49-3dfe18978736\" (UID: \"6c8b56e1-15ef-43f4-bc49-3dfe18978736\") " Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.474869 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t282\" (UniqueName: \"kubernetes.io/projected/6c8b56e1-15ef-43f4-bc49-3dfe18978736-kube-api-access-4t282\") pod \"6c8b56e1-15ef-43f4-bc49-3dfe18978736\" (UID: \"6c8b56e1-15ef-43f4-bc49-3dfe18978736\") " Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.475040 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.477158 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c8b56e1-15ef-43f4-bc49-3dfe18978736-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c8b56e1-15ef-43f4-bc49-3dfe18978736" (UID: "6c8b56e1-15ef-43f4-bc49-3dfe18978736"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.486832 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fc309c6-44f4-4daf-90fa-6bf6845f195d-etc-swift\") pod \"swift-storage-0\" (UID: \"1fc309c6-44f4-4daf-90fa-6bf6845f195d\") " pod="openstack/swift-storage-0" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.499074 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8b56e1-15ef-43f4-bc49-3dfe18978736-kube-api-access-4t282" (OuterVolumeSpecName: "kube-api-access-4t282") pod "6c8b56e1-15ef-43f4-bc49-3dfe18978736" (UID: "6c8b56e1-15ef-43f4-bc49-3dfe18978736"). InnerVolumeSpecName "kube-api-access-4t282". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.517883 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.579998 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl86r\" (UniqueName: \"kubernetes.io/projected/e714f3eb-3dd6-4325-a30c-87fd3730116f-kube-api-access-sl86r\") pod \"certified-operators-thlb8\" (UID: \"e714f3eb-3dd6-4325-a30c-87fd3730116f\") " pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.580392 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e714f3eb-3dd6-4325-a30c-87fd3730116f-catalog-content\") pod \"certified-operators-thlb8\" (UID: \"e714f3eb-3dd6-4325-a30c-87fd3730116f\") " pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.580503 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e714f3eb-3dd6-4325-a30c-87fd3730116f-utilities\") pod \"certified-operators-thlb8\" (UID: \"e714f3eb-3dd6-4325-a30c-87fd3730116f\") " pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.580598 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c8b56e1-15ef-43f4-bc49-3dfe18978736-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.580614 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t282\" (UniqueName: \"kubernetes.io/projected/6c8b56e1-15ef-43f4-bc49-3dfe18978736-kube-api-access-4t282\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.681292 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl86r\" (UniqueName: \"kubernetes.io/projected/e714f3eb-3dd6-4325-a30c-87fd3730116f-kube-api-access-sl86r\") pod \"certified-operators-thlb8\" (UID: \"e714f3eb-3dd6-4325-a30c-87fd3730116f\") " pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.681394 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e714f3eb-3dd6-4325-a30c-87fd3730116f-catalog-content\") pod \"certified-operators-thlb8\" (UID: \"e714f3eb-3dd6-4325-a30c-87fd3730116f\") " pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.681431 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e714f3eb-3dd6-4325-a30c-87fd3730116f-utilities\") pod \"certified-operators-thlb8\" (UID: \"e714f3eb-3dd6-4325-a30c-87fd3730116f\") 
" pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.682236 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e714f3eb-3dd6-4325-a30c-87fd3730116f-utilities\") pod \"certified-operators-thlb8\" (UID: \"e714f3eb-3dd6-4325-a30c-87fd3730116f\") " pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.682514 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e714f3eb-3dd6-4325-a30c-87fd3730116f-catalog-content\") pod \"certified-operators-thlb8\" (UID: \"e714f3eb-3dd6-4325-a30c-87fd3730116f\") " pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.702936 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl86r\" (UniqueName: \"kubernetes.io/projected/e714f3eb-3dd6-4325-a30c-87fd3730116f-kube-api-access-sl86r\") pod \"certified-operators-thlb8\" (UID: \"e714f3eb-3dd6-4325-a30c-87fd3730116f\") " pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:01:55 crc kubenswrapper[4717]: I0221 22:01:55.842344 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:01:56 crc kubenswrapper[4717]: I0221 22:01:56.013036 4717 generic.go:334] "Generic (PLEG): container finished" podID="8357efe6-d264-4a56-902a-b7e443b93ac7" containerID="178405d965dfcf5114b5e461d8ca83be0a88edcc7fde6c38357ea3a03e042d43" exitCode=0 Feb 21 22:01:56 crc kubenswrapper[4717]: I0221 22:01:56.013217 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9b8b-account-create-update-gf86p" event={"ID":"8357efe6-d264-4a56-902a-b7e443b93ac7","Type":"ContainerDied","Data":"178405d965dfcf5114b5e461d8ca83be0a88edcc7fde6c38357ea3a03e042d43"} Feb 21 22:01:56 crc kubenswrapper[4717]: I0221 22:01:56.019239 4717 generic.go:334] "Generic (PLEG): container finished" podID="03fd9592-8cf7-4fda-a394-f0ad5efe2397" containerID="fc7cd4ee13db69b9236e1f63d473984ae556a720a7e31a844c32a4ad4f852bf8" exitCode=0 Feb 21 22:01:56 crc kubenswrapper[4717]: I0221 22:01:56.019344 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2w6q9" event={"ID":"03fd9592-8cf7-4fda-a394-f0ad5efe2397","Type":"ContainerDied","Data":"fc7cd4ee13db69b9236e1f63d473984ae556a720a7e31a844c32a4ad4f852bf8"} Feb 21 22:01:56 crc kubenswrapper[4717]: I0221 22:01:56.032130 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5lppt" event={"ID":"6c8b56e1-15ef-43f4-bc49-3dfe18978736","Type":"ContainerDied","Data":"40fdd25e79df5f942a16a8fab375aa13ae17e6d745a34d9da8760b4a7ffaafe3"} Feb 21 22:01:56 crc kubenswrapper[4717]: I0221 22:01:56.032172 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40fdd25e79df5f942a16a8fab375aa13ae17e6d745a34d9da8760b4a7ffaafe3" Feb 21 22:01:56 crc kubenswrapper[4717]: I0221 22:01:56.032226 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-5lppt" Feb 21 22:01:56 crc kubenswrapper[4717]: I0221 22:01:56.121943 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 21 22:01:56 crc kubenswrapper[4717]: W0221 22:01:56.130585 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fc309c6_44f4_4daf_90fa_6bf6845f195d.slice/crio-c6ab26f82d5d82940cdffaaedeef5bed750573fb411b2fb41c681eb5c4764c33 WatchSource:0}: Error finding container c6ab26f82d5d82940cdffaaedeef5bed750573fb411b2fb41c681eb5c4764c33: Status 404 returned error can't find the container with id c6ab26f82d5d82940cdffaaedeef5bed750573fb411b2fb41c681eb5c4764c33 Feb 21 22:01:56 crc kubenswrapper[4717]: I0221 22:01:56.320233 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-thlb8"] Feb 21 22:01:56 crc kubenswrapper[4717]: W0221 22:01:56.334456 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode714f3eb_3dd6_4325_a30c_87fd3730116f.slice/crio-7c4437589b2f49754810bb88543a6c9f4cfb1c7362cd97f62727420e6a399eab WatchSource:0}: Error finding container 7c4437589b2f49754810bb88543a6c9f4cfb1c7362cd97f62727420e6a399eab: Status 404 returned error can't find the container with id 7c4437589b2f49754810bb88543a6c9f4cfb1c7362cd97f62727420e6a399eab Feb 21 22:01:56 crc kubenswrapper[4717]: I0221 22:01:56.639702 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-828xd" podUID="4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1" containerName="ovn-controller" probeResult="failure" output=< Feb 21 22:01:56 crc kubenswrapper[4717]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 21 22:01:56 crc kubenswrapper[4717]: > Feb 21 22:01:57 crc kubenswrapper[4717]: I0221 22:01:57.040894 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"c6ab26f82d5d82940cdffaaedeef5bed750573fb411b2fb41c681eb5c4764c33"} Feb 21 22:01:57 crc kubenswrapper[4717]: I0221 22:01:57.042439 4717 generic.go:334] "Generic (PLEG): container finished" podID="e714f3eb-3dd6-4325-a30c-87fd3730116f" containerID="235b9a7b589a3537129d5c87f3084c558a59e550fc0a620b92598a924b6cb018" exitCode=0 Feb 21 22:01:57 crc kubenswrapper[4717]: I0221 22:01:57.043196 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thlb8" event={"ID":"e714f3eb-3dd6-4325-a30c-87fd3730116f","Type":"ContainerDied","Data":"235b9a7b589a3537129d5c87f3084c558a59e550fc0a620b92598a924b6cb018"} Feb 21 22:01:57 crc kubenswrapper[4717]: I0221 22:01:57.043215 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thlb8" event={"ID":"e714f3eb-3dd6-4325-a30c-87fd3730116f","Type":"ContainerStarted","Data":"7c4437589b2f49754810bb88543a6c9f4cfb1c7362cd97f62727420e6a399eab"} Feb 21 22:01:57 crc kubenswrapper[4717]: I0221 22:01:57.047704 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqq7s" event={"ID":"db87482a-6aa4-49f8-ac16-b2aa288196ff","Type":"ContainerStarted","Data":"68c4e431bf9dc3ae85e13aa06b69dcd05ee7e1260772e81eaab26777d05f29b4"} Feb 21 22:01:57 crc kubenswrapper[4717]: I0221 22:01:57.084316 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nqq7s" podStartSLOduration=2.186882954 podStartE2EDuration="8.084294719s" podCreationTimestamp="2026-02-21 22:01:49 +0000 UTC" firstStartedPulling="2026-02-21 22:01:50.851044276 +0000 UTC m=+925.632577898" lastFinishedPulling="2026-02-21 22:01:56.748456041 +0000 UTC m=+931.529989663" observedRunningTime="2026-02-21 22:01:57.079194966 +0000 UTC m=+931.860728588" 
watchObservedRunningTime="2026-02-21 22:01:57.084294719 +0000 UTC m=+931.865828351" Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.057748 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2w6q9" event={"ID":"03fd9592-8cf7-4fda-a394-f0ad5efe2397","Type":"ContainerDied","Data":"2f00f3ed17db5a3be44e3750ee104eee2fb92c39c8822fb4d4d2d62d63aa7d02"} Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.057998 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f00f3ed17db5a3be44e3750ee104eee2fb92c39c8822fb4d4d2d62d63aa7d02" Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.059146 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9b8b-account-create-update-gf86p" event={"ID":"8357efe6-d264-4a56-902a-b7e443b93ac7","Type":"ContainerDied","Data":"422489db18d5b34811ee4c048349d2a558b5f851e937c4f6cc47306d87b594d5"} Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.059170 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="422489db18d5b34811ee4c048349d2a558b5f851e937c4f6cc47306d87b594d5" Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.063889 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2w6q9" Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.071418 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9b8b-account-create-update-gf86p" Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.130258 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03fd9592-8cf7-4fda-a394-f0ad5efe2397-operator-scripts\") pod \"03fd9592-8cf7-4fda-a394-f0ad5efe2397\" (UID: \"03fd9592-8cf7-4fda-a394-f0ad5efe2397\") " Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.130339 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6fp6\" (UniqueName: \"kubernetes.io/projected/8357efe6-d264-4a56-902a-b7e443b93ac7-kube-api-access-x6fp6\") pod \"8357efe6-d264-4a56-902a-b7e443b93ac7\" (UID: \"8357efe6-d264-4a56-902a-b7e443b93ac7\") " Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.130887 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03fd9592-8cf7-4fda-a394-f0ad5efe2397-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03fd9592-8cf7-4fda-a394-f0ad5efe2397" (UID: "03fd9592-8cf7-4fda-a394-f0ad5efe2397"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.137468 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8357efe6-d264-4a56-902a-b7e443b93ac7-kube-api-access-x6fp6" (OuterVolumeSpecName: "kube-api-access-x6fp6") pod "8357efe6-d264-4a56-902a-b7e443b93ac7" (UID: "8357efe6-d264-4a56-902a-b7e443b93ac7"). InnerVolumeSpecName "kube-api-access-x6fp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.231652 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6vj5\" (UniqueName: \"kubernetes.io/projected/03fd9592-8cf7-4fda-a394-f0ad5efe2397-kube-api-access-p6vj5\") pod \"03fd9592-8cf7-4fda-a394-f0ad5efe2397\" (UID: \"03fd9592-8cf7-4fda-a394-f0ad5efe2397\") " Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.231753 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8357efe6-d264-4a56-902a-b7e443b93ac7-operator-scripts\") pod \"8357efe6-d264-4a56-902a-b7e443b93ac7\" (UID: \"8357efe6-d264-4a56-902a-b7e443b93ac7\") " Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.232280 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03fd9592-8cf7-4fda-a394-f0ad5efe2397-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.232305 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6fp6\" (UniqueName: \"kubernetes.io/projected/8357efe6-d264-4a56-902a-b7e443b93ac7-kube-api-access-x6fp6\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.232298 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8357efe6-d264-4a56-902a-b7e443b93ac7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8357efe6-d264-4a56-902a-b7e443b93ac7" (UID: "8357efe6-d264-4a56-902a-b7e443b93ac7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.241118 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03fd9592-8cf7-4fda-a394-f0ad5efe2397-kube-api-access-p6vj5" (OuterVolumeSpecName: "kube-api-access-p6vj5") pod "03fd9592-8cf7-4fda-a394-f0ad5efe2397" (UID: "03fd9592-8cf7-4fda-a394-f0ad5efe2397"). InnerVolumeSpecName "kube-api-access-p6vj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.333556 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6vj5\" (UniqueName: \"kubernetes.io/projected/03fd9592-8cf7-4fda-a394-f0ad5efe2397-kube-api-access-p6vj5\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:58 crc kubenswrapper[4717]: I0221 22:01:58.333597 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8357efe6-d264-4a56-902a-b7e443b93ac7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:01:59 crc kubenswrapper[4717]: I0221 22:01:59.081766 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9b8b-account-create-update-gf86p" Feb 21 22:01:59 crc kubenswrapper[4717]: I0221 22:01:59.082045 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2w6q9" Feb 21 22:01:59 crc kubenswrapper[4717]: I0221 22:01:59.394394 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:01:59 crc kubenswrapper[4717]: I0221 22:01:59.394454 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:01:59 crc kubenswrapper[4717]: I0221 22:01:59.494154 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:02:00 crc kubenswrapper[4717]: I0221 22:02:00.098155 4717 generic.go:334] "Generic (PLEG): container finished" podID="e714f3eb-3dd6-4325-a30c-87fd3730116f" containerID="8342c394f94c456be7e5a92ee8a00787dee3785b94dbd444e666aff50596ff29" exitCode=0 Feb 21 22:02:00 crc kubenswrapper[4717]: I0221 22:02:00.098387 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thlb8" event={"ID":"e714f3eb-3dd6-4325-a30c-87fd3730116f","Type":"ContainerDied","Data":"8342c394f94c456be7e5a92ee8a00787dee3785b94dbd444e666aff50596ff29"} Feb 21 22:02:00 crc kubenswrapper[4717]: I0221 22:02:00.103488 4717 generic.go:334] "Generic (PLEG): container finished" podID="c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" containerID="077026fc3358addb5baa8f087ccbe44cd9c25bb57ac6177774c2be4302445486" exitCode=0 Feb 21 22:02:00 crc kubenswrapper[4717]: I0221 22:02:00.103546 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd","Type":"ContainerDied","Data":"077026fc3358addb5baa8f087ccbe44cd9c25bb57ac6177774c2be4302445486"} Feb 21 22:02:00 crc kubenswrapper[4717]: I0221 22:02:00.107081 4717 generic.go:334] "Generic (PLEG): container finished" podID="2400f71f-f7db-4ed8-83aa-8427afd4dcd5" 
containerID="eb8bafce1f801194c335304b8c1230af7ed56a9b5b62a58262e456a8cd3064b1" exitCode=0 Feb 21 22:02:00 crc kubenswrapper[4717]: I0221 22:02:00.107124 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2400f71f-f7db-4ed8-83aa-8427afd4dcd5","Type":"ContainerDied","Data":"eb8bafce1f801194c335304b8c1230af7ed56a9b5b62a58262e456a8cd3064b1"} Feb 21 22:02:00 crc kubenswrapper[4717]: I0221 22:02:00.110895 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"cdbd1dc71bb5f237a2cc66be7b097ea35736f181f7c3398ec3b4e84691ffc097"} Feb 21 22:02:00 crc kubenswrapper[4717]: I0221 22:02:00.110980 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"4d51cb5cb7d4b021b75f54a68d391ab58b196d8e0876d42eec50830512aa3663"} Feb 21 22:02:00 crc kubenswrapper[4717]: I0221 22:02:00.111005 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"6bd624ba8d89e7432c9c3e351528a61d4c5ee9e3a4b1b4250011c332390b6963"} Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.119452 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd","Type":"ContainerStarted","Data":"59fbc8b4fbe404e5c2339d9ed4d1c06437183c12e4a79f8b84f33d59b3f68ce5"} Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.120662 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.122079 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"2400f71f-f7db-4ed8-83aa-8427afd4dcd5","Type":"ContainerStarted","Data":"25df727068804aa0ec572e9e107f336bdd2b5d7c0bcfdc22ac8c108d4ba5c34b"} Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.122467 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.124125 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"ff0b7ae4f9967708c2e2630d5693c500f30dadb5c8749a0862e88c689ed476a9"} Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.125952 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thlb8" event={"ID":"e714f3eb-3dd6-4325-a30c-87fd3730116f","Type":"ContainerStarted","Data":"2cb5645143ff07def7d58c733de21c7f157f6a58f05f84e2f3468f75b5df5351"} Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.153100 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.49608033 podStartE2EDuration="1m0.153080713s" podCreationTimestamp="2026-02-21 22:01:01 +0000 UTC" firstStartedPulling="2026-02-21 22:01:15.804923394 +0000 UTC m=+890.586457016" lastFinishedPulling="2026-02-21 22:01:25.461923777 +0000 UTC m=+900.243457399" observedRunningTime="2026-02-21 22:02:01.147208771 +0000 UTC m=+935.928742393" watchObservedRunningTime="2026-02-21 22:02:01.153080713 +0000 UTC m=+935.934614335" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.173340 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.417549115 podStartE2EDuration="1m0.173320327s" podCreationTimestamp="2026-02-21 22:01:01 +0000 UTC" firstStartedPulling="2026-02-21 22:01:15.357801528 +0000 UTC m=+890.139335150" lastFinishedPulling="2026-02-21 22:01:25.11357274 +0000 
UTC m=+899.895106362" observedRunningTime="2026-02-21 22:02:01.168098932 +0000 UTC m=+935.949632574" watchObservedRunningTime="2026-02-21 22:02:01.173320327 +0000 UTC m=+935.954853949" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.195060 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-thlb8" podStartSLOduration=2.681496649 podStartE2EDuration="6.195039537s" podCreationTimestamp="2026-02-21 22:01:55 +0000 UTC" firstStartedPulling="2026-02-21 22:01:57.043738677 +0000 UTC m=+931.825272299" lastFinishedPulling="2026-02-21 22:02:00.557281565 +0000 UTC m=+935.338815187" observedRunningTime="2026-02-21 22:02:01.187061807 +0000 UTC m=+935.968595429" watchObservedRunningTime="2026-02-21 22:02:01.195039537 +0000 UTC m=+935.976573159" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.621546 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-828xd" podUID="4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1" containerName="ovn-controller" probeResult="failure" output=< Feb 21 22:02:01 crc kubenswrapper[4717]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 21 22:02:01 crc kubenswrapper[4717]: > Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.635588 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.664413 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-sls6z" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.717628 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-4ttdv"] Feb 21 22:02:01 crc kubenswrapper[4717]: E0221 22:02:01.718428 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03fd9592-8cf7-4fda-a394-f0ad5efe2397" containerName="mariadb-account-create-update" Feb 21 22:02:01 
crc kubenswrapper[4717]: I0221 22:02:01.718499 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="03fd9592-8cf7-4fda-a394-f0ad5efe2397" containerName="mariadb-account-create-update" Feb 21 22:02:01 crc kubenswrapper[4717]: E0221 22:02:01.718566 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8357efe6-d264-4a56-902a-b7e443b93ac7" containerName="mariadb-account-create-update" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.718625 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8357efe6-d264-4a56-902a-b7e443b93ac7" containerName="mariadb-account-create-update" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.718838 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="03fd9592-8cf7-4fda-a394-f0ad5efe2397" containerName="mariadb-account-create-update" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.718933 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8357efe6-d264-4a56-902a-b7e443b93ac7" containerName="mariadb-account-create-update" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.719562 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-4ttdv" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.722949 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-pp2fz" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.723549 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.732039 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-4ttdv"] Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.787612 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-config-data\") pod \"glance-db-sync-4ttdv\" (UID: \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") " pod="openstack/glance-db-sync-4ttdv" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.787667 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-combined-ca-bundle\") pod \"glance-db-sync-4ttdv\" (UID: \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") " pod="openstack/glance-db-sync-4ttdv" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.787761 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz64r\" (UniqueName: \"kubernetes.io/projected/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-kube-api-access-jz64r\") pod \"glance-db-sync-4ttdv\" (UID: \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") " pod="openstack/glance-db-sync-4ttdv" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.787789 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-db-sync-config-data\") pod \"glance-db-sync-4ttdv\" (UID: \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") " pod="openstack/glance-db-sync-4ttdv" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.888958 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-config-data\") pod \"glance-db-sync-4ttdv\" (UID: \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") " pod="openstack/glance-db-sync-4ttdv" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.889017 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-combined-ca-bundle\") pod \"glance-db-sync-4ttdv\" (UID: \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") " pod="openstack/glance-db-sync-4ttdv" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.889061 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz64r\" (UniqueName: \"kubernetes.io/projected/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-kube-api-access-jz64r\") pod \"glance-db-sync-4ttdv\" (UID: \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") " pod="openstack/glance-db-sync-4ttdv" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.889082 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-db-sync-config-data\") pod \"glance-db-sync-4ttdv\" (UID: \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") " pod="openstack/glance-db-sync-4ttdv" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.896481 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-db-sync-config-data\") pod \"glance-db-sync-4ttdv\" (UID: 
\"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") " pod="openstack/glance-db-sync-4ttdv" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.896686 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-combined-ca-bundle\") pod \"glance-db-sync-4ttdv\" (UID: \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") " pod="openstack/glance-db-sync-4ttdv" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.897119 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-config-data\") pod \"glance-db-sync-4ttdv\" (UID: \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") " pod="openstack/glance-db-sync-4ttdv" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.908600 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz64r\" (UniqueName: \"kubernetes.io/projected/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-kube-api-access-jz64r\") pod \"glance-db-sync-4ttdv\" (UID: \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") " pod="openstack/glance-db-sync-4ttdv" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.929334 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-828xd-config-6ppd4"] Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.935361 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.943796 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 21 22:02:01 crc kubenswrapper[4717]: I0221 22:02:01.958380 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-828xd-config-6ppd4"] Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.041755 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-4ttdv" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.093668 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k9sl\" (UniqueName: \"kubernetes.io/projected/0a498c86-7b55-49ae-800d-953903785d34-kube-api-access-4k9sl\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.093715 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a498c86-7b55-49ae-800d-953903785d34-scripts\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.093742 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-run-ovn\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.093952 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-log-ovn\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.094064 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0a498c86-7b55-49ae-800d-953903785d34-additional-scripts\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.094250 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-run\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.170327 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"2b857a63134bb9ae80cf3d1159ac5acba5db2486442a1c1a5bbe3f4915c373cf"} Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.176243 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"784818eab1f9eaa35a46778bf6e0181151ec641f940c806197a512169c1b2f4d"} Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.176290 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"c9f864746ba636e0bd5276c057c1e8df6f3f84158fabaa98790d9e01465fbd97"} Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.176305 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"a83cc110f19b7f82ee65092594ef6b648346d09929db6d95f466b8e9667e2e09"} Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.196640 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0a498c86-7b55-49ae-800d-953903785d34-additional-scripts\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.196736 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-run\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.196793 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k9sl\" (UniqueName: \"kubernetes.io/projected/0a498c86-7b55-49ae-800d-953903785d34-kube-api-access-4k9sl\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.196816 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a498c86-7b55-49ae-800d-953903785d34-scripts\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.196847 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-run-ovn\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.196911 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-log-ovn\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.197196 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-log-ovn\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.197252 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-run\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.197494 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0a498c86-7b55-49ae-800d-953903785d34-additional-scripts\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.197599 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-run-ovn\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.199159 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/0a498c86-7b55-49ae-800d-953903785d34-scripts\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.216594 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k9sl\" (UniqueName: \"kubernetes.io/projected/0a498c86-7b55-49ae-800d-953903785d34-kube-api-access-4k9sl\") pod \"ovn-controller-828xd-config-6ppd4\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.262215 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.646230 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-4ttdv"] Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.711580 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ncbrn" Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.746393 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-828xd-config-6ppd4"] Feb 21 22:02:02 crc kubenswrapper[4717]: W0221 22:02:02.758996 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a498c86_7b55_49ae_800d_953903785d34.slice/crio-98f575429210fa0898e2f474e3e104b742bbc0686435a66369e167c4bbba38bc WatchSource:0}: Error finding container 98f575429210fa0898e2f474e3e104b742bbc0686435a66369e167c4bbba38bc: Status 404 returned error can't find the container with id 98f575429210fa0898e2f474e3e104b742bbc0686435a66369e167c4bbba38bc Feb 21 22:02:02 crc kubenswrapper[4717]: I0221 22:02:02.779365 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-ncbrn" Feb 21 22:02:03 crc kubenswrapper[4717]: I0221 22:02:03.174489 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-828xd-config-6ppd4" event={"ID":"0a498c86-7b55-49ae-800d-953903785d34","Type":"ContainerStarted","Data":"98f575429210fa0898e2f474e3e104b742bbc0686435a66369e167c4bbba38bc"} Feb 21 22:02:03 crc kubenswrapper[4717]: I0221 22:02:03.176207 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4ttdv" event={"ID":"7995c515-84a0-44d3-82e8-99a2ab1fb7b2","Type":"ContainerStarted","Data":"200e746d0a89678f59093fbd1064fc825984321f59efc9c5820768362e3096f3"} Feb 21 22:02:03 crc kubenswrapper[4717]: I0221 22:02:03.426685 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ncbrn"] Feb 21 22:02:04 crc kubenswrapper[4717]: I0221 22:02:04.190798 4717 generic.go:334] "Generic (PLEG): container finished" podID="0a498c86-7b55-49ae-800d-953903785d34" containerID="79ee48e189a50ffdf5b61c9e64ea24fcde170e6bce71673b585b23d8eb26926a" exitCode=0 Feb 21 22:02:04 crc kubenswrapper[4717]: I0221 22:02:04.190901 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-828xd-config-6ppd4" event={"ID":"0a498c86-7b55-49ae-800d-953903785d34","Type":"ContainerDied","Data":"79ee48e189a50ffdf5b61c9e64ea24fcde170e6bce71673b585b23d8eb26926a"} Feb 21 22:02:04 crc kubenswrapper[4717]: I0221 22:02:04.201055 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"197ff4715dc431e3f4ea52a258532b32b1a855caa3ed23d605a68c7618898bd6"} Feb 21 22:02:04 crc kubenswrapper[4717]: I0221 22:02:04.201208 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ncbrn" podUID="6bed5ac2-5539-4722-9432-1ee6618783bc" 
containerName="registry-server" containerID="cri-o://590f6884cf91ebac7fbb27743746c694c7ab71f20fd19c685b6291c8ca613dd1" gracePeriod=2 Feb 21 22:02:04 crc kubenswrapper[4717]: I0221 22:02:04.620985 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ncbrn" Feb 21 22:02:04 crc kubenswrapper[4717]: I0221 22:02:04.749661 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bed5ac2-5539-4722-9432-1ee6618783bc-catalog-content\") pod \"6bed5ac2-5539-4722-9432-1ee6618783bc\" (UID: \"6bed5ac2-5539-4722-9432-1ee6618783bc\") " Feb 21 22:02:04 crc kubenswrapper[4717]: I0221 22:02:04.749786 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bed5ac2-5539-4722-9432-1ee6618783bc-utilities\") pod \"6bed5ac2-5539-4722-9432-1ee6618783bc\" (UID: \"6bed5ac2-5539-4722-9432-1ee6618783bc\") " Feb 21 22:02:04 crc kubenswrapper[4717]: I0221 22:02:04.749921 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2rn5\" (UniqueName: \"kubernetes.io/projected/6bed5ac2-5539-4722-9432-1ee6618783bc-kube-api-access-m2rn5\") pod \"6bed5ac2-5539-4722-9432-1ee6618783bc\" (UID: \"6bed5ac2-5539-4722-9432-1ee6618783bc\") " Feb 21 22:02:04 crc kubenswrapper[4717]: I0221 22:02:04.755754 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bed5ac2-5539-4722-9432-1ee6618783bc-kube-api-access-m2rn5" (OuterVolumeSpecName: "kube-api-access-m2rn5") pod "6bed5ac2-5539-4722-9432-1ee6618783bc" (UID: "6bed5ac2-5539-4722-9432-1ee6618783bc"). InnerVolumeSpecName "kube-api-access-m2rn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:02:04 crc kubenswrapper[4717]: I0221 22:02:04.765007 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bed5ac2-5539-4722-9432-1ee6618783bc-utilities" (OuterVolumeSpecName: "utilities") pod "6bed5ac2-5539-4722-9432-1ee6618783bc" (UID: "6bed5ac2-5539-4722-9432-1ee6618783bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:02:04 crc kubenswrapper[4717]: I0221 22:02:04.851265 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bed5ac2-5539-4722-9432-1ee6618783bc-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:04 crc kubenswrapper[4717]: I0221 22:02:04.851522 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2rn5\" (UniqueName: \"kubernetes.io/projected/6bed5ac2-5539-4722-9432-1ee6618783bc-kube-api-access-m2rn5\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:04 crc kubenswrapper[4717]: I0221 22:02:04.908402 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bed5ac2-5539-4722-9432-1ee6618783bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bed5ac2-5539-4722-9432-1ee6618783bc" (UID: "6bed5ac2-5539-4722-9432-1ee6618783bc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:02:04 crc kubenswrapper[4717]: I0221 22:02:04.953792 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bed5ac2-5539-4722-9432-1ee6618783bc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.218557 4717 generic.go:334] "Generic (PLEG): container finished" podID="6bed5ac2-5539-4722-9432-1ee6618783bc" containerID="590f6884cf91ebac7fbb27743746c694c7ab71f20fd19c685b6291c8ca613dd1" exitCode=0 Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.218615 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncbrn" event={"ID":"6bed5ac2-5539-4722-9432-1ee6618783bc","Type":"ContainerDied","Data":"590f6884cf91ebac7fbb27743746c694c7ab71f20fd19c685b6291c8ca613dd1"} Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.218643 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncbrn" event={"ID":"6bed5ac2-5539-4722-9432-1ee6618783bc","Type":"ContainerDied","Data":"d3dc1b2459edc31ec4473237925f910ca6be275326ce32b7fccb45735da21c64"} Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.218660 4717 scope.go:117] "RemoveContainer" containerID="590f6884cf91ebac7fbb27743746c694c7ab71f20fd19c685b6291c8ca613dd1" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.218773 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ncbrn" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.253805 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"e413ddf4ad1833bb98d74eb4521cc5ff0785750360d764eb37e215e4ff6f1fad"} Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.253846 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"c6b9c9a455fc02acfa99efdb6e6783d1c74db3e98837768bee5329bce92fcf79"} Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.253869 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"a66b94a873090ab1c92c81c4d73e4d4a83b1f182fa9f73cd6566d76b568a4090"} Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.253878 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"7659059e43e9d9f12685702bfedea398c5d7244f9adffbe1802b2969ae8b0926"} Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.253886 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"60941441d7473c87fe0b43a7b4a1872f334f04a47cb1611ed6470d81da5f0380"} Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.255343 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ncbrn"] Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.272606 4717 scope.go:117] "RemoveContainer" containerID="f731597efa673138b5ba33082c0fa6f1b56f39d5c6906756a40fbc1cf9ff1c54" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.272981 
4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ncbrn"] Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.299701 4717 scope.go:117] "RemoveContainer" containerID="28b10e04c83c13b9bc1b243752bedad3335cc8f976511743d3d260e52194bf22" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.323700 4717 scope.go:117] "RemoveContainer" containerID="590f6884cf91ebac7fbb27743746c694c7ab71f20fd19c685b6291c8ca613dd1" Feb 21 22:02:05 crc kubenswrapper[4717]: E0221 22:02:05.324070 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"590f6884cf91ebac7fbb27743746c694c7ab71f20fd19c685b6291c8ca613dd1\": container with ID starting with 590f6884cf91ebac7fbb27743746c694c7ab71f20fd19c685b6291c8ca613dd1 not found: ID does not exist" containerID="590f6884cf91ebac7fbb27743746c694c7ab71f20fd19c685b6291c8ca613dd1" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.324122 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"590f6884cf91ebac7fbb27743746c694c7ab71f20fd19c685b6291c8ca613dd1"} err="failed to get container status \"590f6884cf91ebac7fbb27743746c694c7ab71f20fd19c685b6291c8ca613dd1\": rpc error: code = NotFound desc = could not find container \"590f6884cf91ebac7fbb27743746c694c7ab71f20fd19c685b6291c8ca613dd1\": container with ID starting with 590f6884cf91ebac7fbb27743746c694c7ab71f20fd19c685b6291c8ca613dd1 not found: ID does not exist" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.324149 4717 scope.go:117] "RemoveContainer" containerID="f731597efa673138b5ba33082c0fa6f1b56f39d5c6906756a40fbc1cf9ff1c54" Feb 21 22:02:05 crc kubenswrapper[4717]: E0221 22:02:05.325383 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f731597efa673138b5ba33082c0fa6f1b56f39d5c6906756a40fbc1cf9ff1c54\": container with ID starting with 
f731597efa673138b5ba33082c0fa6f1b56f39d5c6906756a40fbc1cf9ff1c54 not found: ID does not exist" containerID="f731597efa673138b5ba33082c0fa6f1b56f39d5c6906756a40fbc1cf9ff1c54" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.325409 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f731597efa673138b5ba33082c0fa6f1b56f39d5c6906756a40fbc1cf9ff1c54"} err="failed to get container status \"f731597efa673138b5ba33082c0fa6f1b56f39d5c6906756a40fbc1cf9ff1c54\": rpc error: code = NotFound desc = could not find container \"f731597efa673138b5ba33082c0fa6f1b56f39d5c6906756a40fbc1cf9ff1c54\": container with ID starting with f731597efa673138b5ba33082c0fa6f1b56f39d5c6906756a40fbc1cf9ff1c54 not found: ID does not exist" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.325424 4717 scope.go:117] "RemoveContainer" containerID="28b10e04c83c13b9bc1b243752bedad3335cc8f976511743d3d260e52194bf22" Feb 21 22:02:05 crc kubenswrapper[4717]: E0221 22:02:05.326042 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28b10e04c83c13b9bc1b243752bedad3335cc8f976511743d3d260e52194bf22\": container with ID starting with 28b10e04c83c13b9bc1b243752bedad3335cc8f976511743d3d260e52194bf22 not found: ID does not exist" containerID="28b10e04c83c13b9bc1b243752bedad3335cc8f976511743d3d260e52194bf22" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.326068 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28b10e04c83c13b9bc1b243752bedad3335cc8f976511743d3d260e52194bf22"} err="failed to get container status \"28b10e04c83c13b9bc1b243752bedad3335cc8f976511743d3d260e52194bf22\": rpc error: code = NotFound desc = could not find container \"28b10e04c83c13b9bc1b243752bedad3335cc8f976511743d3d260e52194bf22\": container with ID starting with 28b10e04c83c13b9bc1b243752bedad3335cc8f976511743d3d260e52194bf22 not found: ID does not 
exist" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.554179 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.663708 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4k9sl\" (UniqueName: \"kubernetes.io/projected/0a498c86-7b55-49ae-800d-953903785d34-kube-api-access-4k9sl\") pod \"0a498c86-7b55-49ae-800d-953903785d34\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.663817 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-log-ovn\") pod \"0a498c86-7b55-49ae-800d-953903785d34\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.663844 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-run-ovn\") pod \"0a498c86-7b55-49ae-800d-953903785d34\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.663932 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0a498c86-7b55-49ae-800d-953903785d34-additional-scripts\") pod \"0a498c86-7b55-49ae-800d-953903785d34\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.663967 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-run\") pod \"0a498c86-7b55-49ae-800d-953903785d34\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " Feb 21 22:02:05 crc 
kubenswrapper[4717]: I0221 22:02:05.663954 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "0a498c86-7b55-49ae-800d-953903785d34" (UID: "0a498c86-7b55-49ae-800d-953903785d34"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.664011 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "0a498c86-7b55-49ae-800d-953903785d34" (UID: "0a498c86-7b55-49ae-800d-953903785d34"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.664029 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a498c86-7b55-49ae-800d-953903785d34-scripts\") pod \"0a498c86-7b55-49ae-800d-953903785d34\" (UID: \"0a498c86-7b55-49ae-800d-953903785d34\") " Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.664302 4717 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.664318 4717 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.664330 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-run" (OuterVolumeSpecName: "var-run") pod "0a498c86-7b55-49ae-800d-953903785d34" 
(UID: "0a498c86-7b55-49ae-800d-953903785d34"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.664481 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a498c86-7b55-49ae-800d-953903785d34-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "0a498c86-7b55-49ae-800d-953903785d34" (UID: "0a498c86-7b55-49ae-800d-953903785d34"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.664975 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a498c86-7b55-49ae-800d-953903785d34-scripts" (OuterVolumeSpecName: "scripts") pod "0a498c86-7b55-49ae-800d-953903785d34" (UID: "0a498c86-7b55-49ae-800d-953903785d34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.668418 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a498c86-7b55-49ae-800d-953903785d34-kube-api-access-4k9sl" (OuterVolumeSpecName: "kube-api-access-4k9sl") pod "0a498c86-7b55-49ae-800d-953903785d34" (UID: "0a498c86-7b55-49ae-800d-953903785d34"). InnerVolumeSpecName "kube-api-access-4k9sl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.770452 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a498c86-7b55-49ae-800d-953903785d34-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.770495 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4k9sl\" (UniqueName: \"kubernetes.io/projected/0a498c86-7b55-49ae-800d-953903785d34-kube-api-access-4k9sl\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.770508 4717 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0a498c86-7b55-49ae-800d-953903785d34-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.770518 4717 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0a498c86-7b55-49ae-800d-953903785d34-var-run\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.844018 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.844059 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.884583 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:02:05 crc kubenswrapper[4717]: I0221 22:02:05.988093 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bed5ac2-5539-4722-9432-1ee6618783bc" path="/var/lib/kubelet/pods/6bed5ac2-5539-4722-9432-1ee6618783bc/volumes" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 
22:02:06.280096 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1fc309c6-44f4-4daf-90fa-6bf6845f195d","Type":"ContainerStarted","Data":"ad52100ac659cc1a9e5ef09cd9f877d400372d29f7d2047278e9b6cfca46916d"} Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.282936 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-828xd-config-6ppd4" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.282996 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-828xd-config-6ppd4" event={"ID":"0a498c86-7b55-49ae-800d-953903785d34","Type":"ContainerDied","Data":"98f575429210fa0898e2f474e3e104b742bbc0686435a66369e167c4bbba38bc"} Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.283028 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98f575429210fa0898e2f474e3e104b742bbc0686435a66369e167c4bbba38bc" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.334204 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.717695081 podStartE2EDuration="28.334186112s" podCreationTimestamp="2026-02-21 22:01:38 +0000 UTC" firstStartedPulling="2026-02-21 22:01:56.132881719 +0000 UTC m=+930.914415341" lastFinishedPulling="2026-02-21 22:02:03.74937271 +0000 UTC m=+938.530906372" observedRunningTime="2026-02-21 22:02:06.330676107 +0000 UTC m=+941.112209729" watchObservedRunningTime="2026-02-21 22:02:06.334186112 +0000 UTC m=+941.115719734" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.349114 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.617982 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-djqlv"] Feb 21 22:02:06 crc kubenswrapper[4717]: E0221 
22:02:06.618308 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a498c86-7b55-49ae-800d-953903785d34" containerName="ovn-config" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.618327 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a498c86-7b55-49ae-800d-953903785d34" containerName="ovn-config" Feb 21 22:02:06 crc kubenswrapper[4717]: E0221 22:02:06.618346 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bed5ac2-5539-4722-9432-1ee6618783bc" containerName="extract-utilities" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.618353 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bed5ac2-5539-4722-9432-1ee6618783bc" containerName="extract-utilities" Feb 21 22:02:06 crc kubenswrapper[4717]: E0221 22:02:06.618367 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bed5ac2-5539-4722-9432-1ee6618783bc" containerName="extract-content" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.618373 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bed5ac2-5539-4722-9432-1ee6618783bc" containerName="extract-content" Feb 21 22:02:06 crc kubenswrapper[4717]: E0221 22:02:06.618382 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bed5ac2-5539-4722-9432-1ee6618783bc" containerName="registry-server" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.618390 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bed5ac2-5539-4722-9432-1ee6618783bc" containerName="registry-server" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.618572 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a498c86-7b55-49ae-800d-953903785d34" containerName="ovn-config" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.618598 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bed5ac2-5539-4722-9432-1ee6618783bc" containerName="registry-server" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.619364 
4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.625009 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.637531 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-djqlv"] Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.665292 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-828xd-config-6ppd4"] Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.672356 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-828xd-config-6ppd4"] Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.684127 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-828xd" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.684911 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-config\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.684958 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.684999 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.685035 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnbbc\" (UniqueName: \"kubernetes.io/projected/0351efa1-d14f-468d-ad6c-ea432ef629ba-kube-api-access-bnbbc\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.685057 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.685179 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.766715 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-828xd-config-8xx4w"] Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.767896 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.769669 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.778663 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-828xd-config-8xx4w"] Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.786510 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.786637 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-config\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.786675 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.786699 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 
22:02:06.788019 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.793050 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnbbc\" (UniqueName: \"kubernetes.io/projected/0351efa1-d14f-468d-ad6c-ea432ef629ba-kube-api-access-bnbbc\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.793134 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.793654 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.794010 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.794319 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.794688 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-config\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.828169 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnbbc\" (UniqueName: \"kubernetes.io/projected/0351efa1-d14f-468d-ad6c-ea432ef629ba-kube-api-access-bnbbc\") pod \"dnsmasq-dns-77585f5f8c-djqlv\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") " pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.895288 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-run-ovn\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.895367 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-run\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.895460 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-additional-scripts\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.895489 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-log-ovn\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.895570 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n5xp\" (UniqueName: \"kubernetes.io/projected/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-kube-api-access-6n5xp\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.895745 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-scripts\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.941875 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.997143 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-additional-scripts\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.997199 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-log-ovn\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.997235 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n5xp\" (UniqueName: \"kubernetes.io/projected/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-kube-api-access-6n5xp\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.997273 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-scripts\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.997298 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-run-ovn\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: 
\"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.997329 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-run\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.997583 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-run\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.997595 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-log-ovn\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.997750 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-run-ovn\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:06 crc kubenswrapper[4717]: I0221 22:02:06.998024 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-additional-scripts\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " 
pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:07 crc kubenswrapper[4717]: I0221 22:02:07.006830 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-scripts\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:07 crc kubenswrapper[4717]: I0221 22:02:07.025456 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n5xp\" (UniqueName: \"kubernetes.io/projected/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-kube-api-access-6n5xp\") pod \"ovn-controller-828xd-config-8xx4w\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:07 crc kubenswrapper[4717]: I0221 22:02:07.085411 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:07 crc kubenswrapper[4717]: I0221 22:02:07.354161 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-djqlv"] Feb 21 22:02:07 crc kubenswrapper[4717]: I0221 22:02:07.627698 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-828xd-config-8xx4w"] Feb 21 22:02:07 crc kubenswrapper[4717]: W0221 22:02:07.664822 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14227aa2_2ab6_4397_bc31_7f5c9767ccb9.slice/crio-31c8387ee11b60a329973422c6af9969ec35132cf35fb734f75844b93a3ae302 WatchSource:0}: Error finding container 31c8387ee11b60a329973422c6af9969ec35132cf35fb734f75844b93a3ae302: Status 404 returned error can't find the container with id 31c8387ee11b60a329973422c6af9969ec35132cf35fb734f75844b93a3ae302 Feb 21 22:02:07 crc kubenswrapper[4717]: I0221 22:02:07.824738 4717 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-thlb8"] Feb 21 22:02:07 crc kubenswrapper[4717]: I0221 22:02:07.986314 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a498c86-7b55-49ae-800d-953903785d34" path="/var/lib/kubelet/pods/0a498c86-7b55-49ae-800d-953903785d34/volumes" Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.305853 4717 generic.go:334] "Generic (PLEG): container finished" podID="0351efa1-d14f-468d-ad6c-ea432ef629ba" containerID="aa595f9a91865784c26f785813687011b447429c09f9e83f3f4a8ea06d205916" exitCode=0 Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.305931 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" event={"ID":"0351efa1-d14f-468d-ad6c-ea432ef629ba","Type":"ContainerDied","Data":"aa595f9a91865784c26f785813687011b447429c09f9e83f3f4a8ea06d205916"} Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.305957 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" event={"ID":"0351efa1-d14f-468d-ad6c-ea432ef629ba","Type":"ContainerStarted","Data":"4cc2c46d9884013fd156416f731d09efead3e20c3eef2be5a763e2ccc1d4e773"} Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.308123 4717 generic.go:334] "Generic (PLEG): container finished" podID="14227aa2-2ab6-4397-bc31-7f5c9767ccb9" containerID="78ed7a4e15a63c17670c1a65ac66fd9b7cf834115de5ac9b2ebdf64a6319cfb5" exitCode=0 Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.308183 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-828xd-config-8xx4w" event={"ID":"14227aa2-2ab6-4397-bc31-7f5c9767ccb9","Type":"ContainerDied","Data":"78ed7a4e15a63c17670c1a65ac66fd9b7cf834115de5ac9b2ebdf64a6319cfb5"} Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.308207 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-828xd-config-8xx4w" 
event={"ID":"14227aa2-2ab6-4397-bc31-7f5c9767ccb9","Type":"ContainerStarted","Data":"31c8387ee11b60a329973422c6af9969ec35132cf35fb734f75844b93a3ae302"} Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.308493 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-thlb8" podUID="e714f3eb-3dd6-4325-a30c-87fd3730116f" containerName="registry-server" containerID="cri-o://2cb5645143ff07def7d58c733de21c7f157f6a58f05f84e2f3468f75b5df5351" gracePeriod=2 Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.711748 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.835388 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl86r\" (UniqueName: \"kubernetes.io/projected/e714f3eb-3dd6-4325-a30c-87fd3730116f-kube-api-access-sl86r\") pod \"e714f3eb-3dd6-4325-a30c-87fd3730116f\" (UID: \"e714f3eb-3dd6-4325-a30c-87fd3730116f\") " Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.835484 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e714f3eb-3dd6-4325-a30c-87fd3730116f-catalog-content\") pod \"e714f3eb-3dd6-4325-a30c-87fd3730116f\" (UID: \"e714f3eb-3dd6-4325-a30c-87fd3730116f\") " Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.835575 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e714f3eb-3dd6-4325-a30c-87fd3730116f-utilities\") pod \"e714f3eb-3dd6-4325-a30c-87fd3730116f\" (UID: \"e714f3eb-3dd6-4325-a30c-87fd3730116f\") " Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.839111 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e714f3eb-3dd6-4325-a30c-87fd3730116f-utilities" (OuterVolumeSpecName: "utilities") pod "e714f3eb-3dd6-4325-a30c-87fd3730116f" (UID: "e714f3eb-3dd6-4325-a30c-87fd3730116f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.841772 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e714f3eb-3dd6-4325-a30c-87fd3730116f-kube-api-access-sl86r" (OuterVolumeSpecName: "kube-api-access-sl86r") pod "e714f3eb-3dd6-4325-a30c-87fd3730116f" (UID: "e714f3eb-3dd6-4325-a30c-87fd3730116f"). InnerVolumeSpecName "kube-api-access-sl86r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.908943 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e714f3eb-3dd6-4325-a30c-87fd3730116f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e714f3eb-3dd6-4325-a30c-87fd3730116f" (UID: "e714f3eb-3dd6-4325-a30c-87fd3730116f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.937462 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl86r\" (UniqueName: \"kubernetes.io/projected/e714f3eb-3dd6-4325-a30c-87fd3730116f-kube-api-access-sl86r\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.937498 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e714f3eb-3dd6-4325-a30c-87fd3730116f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:08 crc kubenswrapper[4717]: I0221 22:02:08.937510 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e714f3eb-3dd6-4325-a30c-87fd3730116f-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.062911 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.063224 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.063269 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.063960 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"2d284cad9372c32723c3911aa224f8fd37b88ced35957297bd0664e6eabafd92"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.064012 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://2d284cad9372c32723c3911aa224f8fd37b88ced35957297bd0664e6eabafd92" gracePeriod=600 Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.320193 4717 generic.go:334] "Generic (PLEG): container finished" podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerID="2d284cad9372c32723c3911aa224f8fd37b88ced35957297bd0664e6eabafd92" exitCode=0 Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.320258 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"2d284cad9372c32723c3911aa224f8fd37b88ced35957297bd0664e6eabafd92"} Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.320288 4717 scope.go:117] "RemoveContainer" containerID="38f86864c8d1bb2ef635ae7b8573c0d40328b4d39ce0f3640268f93045f23c56" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.323489 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" event={"ID":"0351efa1-d14f-468d-ad6c-ea432ef629ba","Type":"ContainerStarted","Data":"7f2e64ff7f3927feef67353f92620a3d4ae8bbdf80e7870369ab49516feb1b1c"} Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.323591 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.326596 4717 
generic.go:334] "Generic (PLEG): container finished" podID="e714f3eb-3dd6-4325-a30c-87fd3730116f" containerID="2cb5645143ff07def7d58c733de21c7f157f6a58f05f84e2f3468f75b5df5351" exitCode=0 Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.326645 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-thlb8" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.326696 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thlb8" event={"ID":"e714f3eb-3dd6-4325-a30c-87fd3730116f","Type":"ContainerDied","Data":"2cb5645143ff07def7d58c733de21c7f157f6a58f05f84e2f3468f75b5df5351"} Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.326755 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thlb8" event={"ID":"e714f3eb-3dd6-4325-a30c-87fd3730116f","Type":"ContainerDied","Data":"7c4437589b2f49754810bb88543a6c9f4cfb1c7362cd97f62727420e6a399eab"} Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.343981 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" podStartSLOduration=3.343966408 podStartE2EDuration="3.343966408s" podCreationTimestamp="2026-02-21 22:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:02:09.343150908 +0000 UTC m=+944.124684560" watchObservedRunningTime="2026-02-21 22:02:09.343966408 +0000 UTC m=+944.125500030" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.364281 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-thlb8"] Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.372621 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-thlb8"] Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 
22:02:09.380549 4717 scope.go:117] "RemoveContainer" containerID="2cb5645143ff07def7d58c733de21c7f157f6a58f05f84e2f3468f75b5df5351" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.396524 4717 scope.go:117] "RemoveContainer" containerID="8342c394f94c456be7e5a92ee8a00787dee3785b94dbd444e666aff50596ff29" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.423360 4717 scope.go:117] "RemoveContainer" containerID="235b9a7b589a3537129d5c87f3084c558a59e550fc0a620b92598a924b6cb018" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.460248 4717 scope.go:117] "RemoveContainer" containerID="2cb5645143ff07def7d58c733de21c7f157f6a58f05f84e2f3468f75b5df5351" Feb 21 22:02:09 crc kubenswrapper[4717]: E0221 22:02:09.460767 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb5645143ff07def7d58c733de21c7f157f6a58f05f84e2f3468f75b5df5351\": container with ID starting with 2cb5645143ff07def7d58c733de21c7f157f6a58f05f84e2f3468f75b5df5351 not found: ID does not exist" containerID="2cb5645143ff07def7d58c733de21c7f157f6a58f05f84e2f3468f75b5df5351" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.460796 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb5645143ff07def7d58c733de21c7f157f6a58f05f84e2f3468f75b5df5351"} err="failed to get container status \"2cb5645143ff07def7d58c733de21c7f157f6a58f05f84e2f3468f75b5df5351\": rpc error: code = NotFound desc = could not find container \"2cb5645143ff07def7d58c733de21c7f157f6a58f05f84e2f3468f75b5df5351\": container with ID starting with 2cb5645143ff07def7d58c733de21c7f157f6a58f05f84e2f3468f75b5df5351 not found: ID does not exist" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.460819 4717 scope.go:117] "RemoveContainer" containerID="8342c394f94c456be7e5a92ee8a00787dee3785b94dbd444e666aff50596ff29" Feb 21 22:02:09 crc kubenswrapper[4717]: E0221 22:02:09.461152 4717 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8342c394f94c456be7e5a92ee8a00787dee3785b94dbd444e666aff50596ff29\": container with ID starting with 8342c394f94c456be7e5a92ee8a00787dee3785b94dbd444e666aff50596ff29 not found: ID does not exist" containerID="8342c394f94c456be7e5a92ee8a00787dee3785b94dbd444e666aff50596ff29" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.461171 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8342c394f94c456be7e5a92ee8a00787dee3785b94dbd444e666aff50596ff29"} err="failed to get container status \"8342c394f94c456be7e5a92ee8a00787dee3785b94dbd444e666aff50596ff29\": rpc error: code = NotFound desc = could not find container \"8342c394f94c456be7e5a92ee8a00787dee3785b94dbd444e666aff50596ff29\": container with ID starting with 8342c394f94c456be7e5a92ee8a00787dee3785b94dbd444e666aff50596ff29 not found: ID does not exist" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.461183 4717 scope.go:117] "RemoveContainer" containerID="235b9a7b589a3537129d5c87f3084c558a59e550fc0a620b92598a924b6cb018" Feb 21 22:02:09 crc kubenswrapper[4717]: E0221 22:02:09.461372 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"235b9a7b589a3537129d5c87f3084c558a59e550fc0a620b92598a924b6cb018\": container with ID starting with 235b9a7b589a3537129d5c87f3084c558a59e550fc0a620b92598a924b6cb018 not found: ID does not exist" containerID="235b9a7b589a3537129d5c87f3084c558a59e550fc0a620b92598a924b6cb018" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.461391 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"235b9a7b589a3537129d5c87f3084c558a59e550fc0a620b92598a924b6cb018"} err="failed to get container status \"235b9a7b589a3537129d5c87f3084c558a59e550fc0a620b92598a924b6cb018\": rpc error: code = NotFound desc = could 
not find container \"235b9a7b589a3537129d5c87f3084c558a59e550fc0a620b92598a924b6cb018\": container with ID starting with 235b9a7b589a3537129d5c87f3084c558a59e550fc0a620b92598a924b6cb018 not found: ID does not exist" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.489714 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.600475 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.652568 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-log-ovn\") pod \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.652695 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n5xp\" (UniqueName: \"kubernetes.io/projected/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-kube-api-access-6n5xp\") pod \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.652735 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-run\") pod \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.652763 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-run-ovn\") pod \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\" (UID: 
\"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.652876 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-additional-scripts\") pod \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.652945 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-scripts\") pod \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\" (UID: \"14227aa2-2ab6-4397-bc31-7f5c9767ccb9\") " Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.654647 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "14227aa2-2ab6-4397-bc31-7f5c9767ccb9" (UID: "14227aa2-2ab6-4397-bc31-7f5c9767ccb9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.654647 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-run" (OuterVolumeSpecName: "var-run") pod "14227aa2-2ab6-4397-bc31-7f5c9767ccb9" (UID: "14227aa2-2ab6-4397-bc31-7f5c9767ccb9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.654713 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "14227aa2-2ab6-4397-bc31-7f5c9767ccb9" (UID: "14227aa2-2ab6-4397-bc31-7f5c9767ccb9"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.655296 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "14227aa2-2ab6-4397-bc31-7f5c9767ccb9" (UID: "14227aa2-2ab6-4397-bc31-7f5c9767ccb9"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.658299 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-scripts" (OuterVolumeSpecName: "scripts") pod "14227aa2-2ab6-4397-bc31-7f5c9767ccb9" (UID: "14227aa2-2ab6-4397-bc31-7f5c9767ccb9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.663516 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-kube-api-access-6n5xp" (OuterVolumeSpecName: "kube-api-access-6n5xp") pod "14227aa2-2ab6-4397-bc31-7f5c9767ccb9" (UID: "14227aa2-2ab6-4397-bc31-7f5c9767ccb9"). InnerVolumeSpecName "kube-api-access-6n5xp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.754277 4717 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.754310 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n5xp\" (UniqueName: \"kubernetes.io/projected/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-kube-api-access-6n5xp\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.754328 4717 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-run\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.754341 4717 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.754349 4717 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.754357 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14227aa2-2ab6-4397-bc31-7f5c9767ccb9-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:09 crc kubenswrapper[4717]: I0221 22:02:09.991739 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e714f3eb-3dd6-4325-a30c-87fd3730116f" path="/var/lib/kubelet/pods/e714f3eb-3dd6-4325-a30c-87fd3730116f/volumes" Feb 21 22:02:10 crc kubenswrapper[4717]: I0221 22:02:10.341714 4717 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovn-controller-828xd-config-8xx4w" event={"ID":"14227aa2-2ab6-4397-bc31-7f5c9767ccb9","Type":"ContainerDied","Data":"31c8387ee11b60a329973422c6af9969ec35132cf35fb734f75844b93a3ae302"} Feb 21 22:02:10 crc kubenswrapper[4717]: I0221 22:02:10.341752 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31c8387ee11b60a329973422c6af9969ec35132cf35fb734f75844b93a3ae302" Feb 21 22:02:10 crc kubenswrapper[4717]: I0221 22:02:10.341750 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-828xd-config-8xx4w" Feb 21 22:02:10 crc kubenswrapper[4717]: I0221 22:02:10.345401 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"d6ea5ddcf698b572f76b6bdcff7985d26f0ef62fef8084d68925d625b747dd34"} Feb 21 22:02:10 crc kubenswrapper[4717]: I0221 22:02:10.688217 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-828xd-config-8xx4w"] Feb 21 22:02:10 crc kubenswrapper[4717]: I0221 22:02:10.697646 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-828xd-config-8xx4w"] Feb 21 22:02:11 crc kubenswrapper[4717]: I0221 22:02:11.838258 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqq7s"] Feb 21 22:02:11 crc kubenswrapper[4717]: I0221 22:02:11.840287 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nqq7s" podUID="db87482a-6aa4-49f8-ac16-b2aa288196ff" containerName="registry-server" containerID="cri-o://68c4e431bf9dc3ae85e13aa06b69dcd05ee7e1260772e81eaab26777d05f29b4" gracePeriod=2 Feb 21 22:02:11 crc kubenswrapper[4717]: I0221 22:02:11.992100 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="14227aa2-2ab6-4397-bc31-7f5c9767ccb9" path="/var/lib/kubelet/pods/14227aa2-2ab6-4397-bc31-7f5c9767ccb9/volumes" Feb 21 22:02:12 crc kubenswrapper[4717]: I0221 22:02:12.377678 4717 generic.go:334] "Generic (PLEG): container finished" podID="db87482a-6aa4-49f8-ac16-b2aa288196ff" containerID="68c4e431bf9dc3ae85e13aa06b69dcd05ee7e1260772e81eaab26777d05f29b4" exitCode=0 Feb 21 22:02:12 crc kubenswrapper[4717]: I0221 22:02:12.378041 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqq7s" event={"ID":"db87482a-6aa4-49f8-ac16-b2aa288196ff","Type":"ContainerDied","Data":"68c4e431bf9dc3ae85e13aa06b69dcd05ee7e1260772e81eaab26777d05f29b4"} Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.061081 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.216076 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.410474 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-vwk86"] Feb 21 22:02:13 crc kubenswrapper[4717]: E0221 22:02:13.410783 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e714f3eb-3dd6-4325-a30c-87fd3730116f" containerName="extract-utilities" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.410799 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e714f3eb-3dd6-4325-a30c-87fd3730116f" containerName="extract-utilities" Feb 21 22:02:13 crc kubenswrapper[4717]: E0221 22:02:13.410820 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e714f3eb-3dd6-4325-a30c-87fd3730116f" containerName="registry-server" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.410827 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e714f3eb-3dd6-4325-a30c-87fd3730116f" containerName="registry-server" 
Feb 21 22:02:13 crc kubenswrapper[4717]: E0221 22:02:13.410836 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e714f3eb-3dd6-4325-a30c-87fd3730116f" containerName="extract-content" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.410842 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e714f3eb-3dd6-4325-a30c-87fd3730116f" containerName="extract-content" Feb 21 22:02:13 crc kubenswrapper[4717]: E0221 22:02:13.410851 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14227aa2-2ab6-4397-bc31-7f5c9767ccb9" containerName="ovn-config" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.410856 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="14227aa2-2ab6-4397-bc31-7f5c9767ccb9" containerName="ovn-config" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.411085 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="14227aa2-2ab6-4397-bc31-7f5c9767ccb9" containerName="ovn-config" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.411098 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e714f3eb-3dd6-4325-a30c-87fd3730116f" containerName="registry-server" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.411536 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-vwk86" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.427683 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vwk86"] Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.516603 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff13878c-357b-4a39-8b6b-f4f6e1929fed-operator-scripts\") pod \"cinder-db-create-vwk86\" (UID: \"ff13878c-357b-4a39-8b6b-f4f6e1929fed\") " pod="openstack/cinder-db-create-vwk86" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.516696 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4w8h\" (UniqueName: \"kubernetes.io/projected/ff13878c-357b-4a39-8b6b-f4f6e1929fed-kube-api-access-r4w8h\") pod \"cinder-db-create-vwk86\" (UID: \"ff13878c-357b-4a39-8b6b-f4f6e1929fed\") " pod="openstack/cinder-db-create-vwk86" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.543909 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e2b5-account-create-update-bq26b"] Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.552136 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e2b5-account-create-update-bq26b" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.554613 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.576471 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e2b5-account-create-update-bq26b"] Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.618023 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7dda56c-0b65-46d4-9f88-dda7c73423da-operator-scripts\") pod \"cinder-e2b5-account-create-update-bq26b\" (UID: \"b7dda56c-0b65-46d4-9f88-dda7c73423da\") " pod="openstack/cinder-e2b5-account-create-update-bq26b" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.618078 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff13878c-357b-4a39-8b6b-f4f6e1929fed-operator-scripts\") pod \"cinder-db-create-vwk86\" (UID: \"ff13878c-357b-4a39-8b6b-f4f6e1929fed\") " pod="openstack/cinder-db-create-vwk86" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.618104 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4w8h\" (UniqueName: \"kubernetes.io/projected/ff13878c-357b-4a39-8b6b-f4f6e1929fed-kube-api-access-r4w8h\") pod \"cinder-db-create-vwk86\" (UID: \"ff13878c-357b-4a39-8b6b-f4f6e1929fed\") " pod="openstack/cinder-db-create-vwk86" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.618136 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xp8j\" (UniqueName: \"kubernetes.io/projected/b7dda56c-0b65-46d4-9f88-dda7c73423da-kube-api-access-4xp8j\") pod \"cinder-e2b5-account-create-update-bq26b\" (UID: 
\"b7dda56c-0b65-46d4-9f88-dda7c73423da\") " pod="openstack/cinder-e2b5-account-create-update-bq26b" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.618924 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff13878c-357b-4a39-8b6b-f4f6e1929fed-operator-scripts\") pod \"cinder-db-create-vwk86\" (UID: \"ff13878c-357b-4a39-8b6b-f4f6e1929fed\") " pod="openstack/cinder-db-create-vwk86" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.636042 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-g8gpn"] Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.644987 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-g8gpn" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.653926 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-g8gpn"] Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.665742 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4w8h\" (UniqueName: \"kubernetes.io/projected/ff13878c-357b-4a39-8b6b-f4f6e1929fed-kube-api-access-r4w8h\") pod \"cinder-db-create-vwk86\" (UID: \"ff13878c-357b-4a39-8b6b-f4f6e1929fed\") " pod="openstack/cinder-db-create-vwk86" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.698912 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-7m8mk"] Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.699853 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7m8mk" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.704218 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.704387 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.704511 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lfvb4" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.706496 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.715979 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7m8mk"] Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.719447 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b03743-48fc-4006-8cfd-b912deba0232-config-data\") pod \"keystone-db-sync-7m8mk\" (UID: \"99b03743-48fc-4006-8cfd-b912deba0232\") " pod="openstack/keystone-db-sync-7m8mk" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.719555 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5km2l\" (UniqueName: \"kubernetes.io/projected/99b03743-48fc-4006-8cfd-b912deba0232-kube-api-access-5km2l\") pod \"keystone-db-sync-7m8mk\" (UID: \"99b03743-48fc-4006-8cfd-b912deba0232\") " pod="openstack/keystone-db-sync-7m8mk" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.719683 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7dda56c-0b65-46d4-9f88-dda7c73423da-operator-scripts\") pod \"cinder-e2b5-account-create-update-bq26b\" (UID: 
\"b7dda56c-0b65-46d4-9f88-dda7c73423da\") " pod="openstack/cinder-e2b5-account-create-update-bq26b" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.719769 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b03743-48fc-4006-8cfd-b912deba0232-combined-ca-bundle\") pod \"keystone-db-sync-7m8mk\" (UID: \"99b03743-48fc-4006-8cfd-b912deba0232\") " pod="openstack/keystone-db-sync-7m8mk" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.719872 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xp8j\" (UniqueName: \"kubernetes.io/projected/b7dda56c-0b65-46d4-9f88-dda7c73423da-kube-api-access-4xp8j\") pod \"cinder-e2b5-account-create-update-bq26b\" (UID: \"b7dda56c-0b65-46d4-9f88-dda7c73423da\") " pod="openstack/cinder-e2b5-account-create-update-bq26b" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.720783 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7dda56c-0b65-46d4-9f88-dda7c73423da-operator-scripts\") pod \"cinder-e2b5-account-create-update-bq26b\" (UID: \"b7dda56c-0b65-46d4-9f88-dda7c73423da\") " pod="openstack/cinder-e2b5-account-create-update-bq26b" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.731481 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vwk86" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.739470 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hmzb4"] Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.740464 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hmzb4" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.745839 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d9ae-account-create-update-69mhh"] Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.746843 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d9ae-account-create-update-69mhh" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.751167 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.776357 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xp8j\" (UniqueName: \"kubernetes.io/projected/b7dda56c-0b65-46d4-9f88-dda7c73423da-kube-api-access-4xp8j\") pod \"cinder-e2b5-account-create-update-bq26b\" (UID: \"b7dda56c-0b65-46d4-9f88-dda7c73423da\") " pod="openstack/cinder-e2b5-account-create-update-bq26b" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.790928 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hmzb4"] Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.796360 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d9ae-account-create-update-69mhh"] Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.821101 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz256\" (UniqueName: \"kubernetes.io/projected/b1e4b601-cb61-433b-8e06-cbd920071fc5-kube-api-access-fz256\") pod \"barbican-db-create-g8gpn\" (UID: \"b1e4b601-cb61-433b-8e06-cbd920071fc5\") " pod="openstack/barbican-db-create-g8gpn" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.821155 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/99b03743-48fc-4006-8cfd-b912deba0232-combined-ca-bundle\") pod \"keystone-db-sync-7m8mk\" (UID: \"99b03743-48fc-4006-8cfd-b912deba0232\") " pod="openstack/keystone-db-sync-7m8mk" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.821178 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/040a8750-e237-47d5-8b7b-8f310c436b87-operator-scripts\") pod \"barbican-d9ae-account-create-update-69mhh\" (UID: \"040a8750-e237-47d5-8b7b-8f310c436b87\") " pod="openstack/barbican-d9ae-account-create-update-69mhh" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.821202 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86a464ef-0333-4c29-a6a7-8af81a592e0b-operator-scripts\") pod \"neutron-db-create-hmzb4\" (UID: \"86a464ef-0333-4c29-a6a7-8af81a592e0b\") " pod="openstack/neutron-db-create-hmzb4" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.821243 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b03743-48fc-4006-8cfd-b912deba0232-config-data\") pod \"keystone-db-sync-7m8mk\" (UID: \"99b03743-48fc-4006-8cfd-b912deba0232\") " pod="openstack/keystone-db-sync-7m8mk" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.821265 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5km2l\" (UniqueName: \"kubernetes.io/projected/99b03743-48fc-4006-8cfd-b912deba0232-kube-api-access-5km2l\") pod \"keystone-db-sync-7m8mk\" (UID: \"99b03743-48fc-4006-8cfd-b912deba0232\") " pod="openstack/keystone-db-sync-7m8mk" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.821289 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-762fv\" 
(UniqueName: \"kubernetes.io/projected/86a464ef-0333-4c29-a6a7-8af81a592e0b-kube-api-access-762fv\") pod \"neutron-db-create-hmzb4\" (UID: \"86a464ef-0333-4c29-a6a7-8af81a592e0b\") " pod="openstack/neutron-db-create-hmzb4" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.821316 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79jwt\" (UniqueName: \"kubernetes.io/projected/040a8750-e237-47d5-8b7b-8f310c436b87-kube-api-access-79jwt\") pod \"barbican-d9ae-account-create-update-69mhh\" (UID: \"040a8750-e237-47d5-8b7b-8f310c436b87\") " pod="openstack/barbican-d9ae-account-create-update-69mhh" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.821350 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1e4b601-cb61-433b-8e06-cbd920071fc5-operator-scripts\") pod \"barbican-db-create-g8gpn\" (UID: \"b1e4b601-cb61-433b-8e06-cbd920071fc5\") " pod="openstack/barbican-db-create-g8gpn" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.837512 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b03743-48fc-4006-8cfd-b912deba0232-combined-ca-bundle\") pod \"keystone-db-sync-7m8mk\" (UID: \"99b03743-48fc-4006-8cfd-b912deba0232\") " pod="openstack/keystone-db-sync-7m8mk" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.838374 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b03743-48fc-4006-8cfd-b912deba0232-config-data\") pod \"keystone-db-sync-7m8mk\" (UID: \"99b03743-48fc-4006-8cfd-b912deba0232\") " pod="openstack/keystone-db-sync-7m8mk" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.864165 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5km2l\" (UniqueName: 
\"kubernetes.io/projected/99b03743-48fc-4006-8cfd-b912deba0232-kube-api-access-5km2l\") pod \"keystone-db-sync-7m8mk\" (UID: \"99b03743-48fc-4006-8cfd-b912deba0232\") " pod="openstack/keystone-db-sync-7m8mk" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.874443 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b660-account-create-update-5brnn"] Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.875369 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b660-account-create-update-5brnn" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.883360 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.894326 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b660-account-create-update-5brnn"] Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.922187 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1e4b601-cb61-433b-8e06-cbd920071fc5-operator-scripts\") pod \"barbican-db-create-g8gpn\" (UID: \"b1e4b601-cb61-433b-8e06-cbd920071fc5\") " pod="openstack/barbican-db-create-g8gpn" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.922335 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz256\" (UniqueName: \"kubernetes.io/projected/b1e4b601-cb61-433b-8e06-cbd920071fc5-kube-api-access-fz256\") pod \"barbican-db-create-g8gpn\" (UID: \"b1e4b601-cb61-433b-8e06-cbd920071fc5\") " pod="openstack/barbican-db-create-g8gpn" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.922426 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd80c486-9397-43f2-ba5e-c6f868a2a47a-operator-scripts\") pod 
\"neutron-b660-account-create-update-5brnn\" (UID: \"dd80c486-9397-43f2-ba5e-c6f868a2a47a\") " pod="openstack/neutron-b660-account-create-update-5brnn" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.922513 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/040a8750-e237-47d5-8b7b-8f310c436b87-operator-scripts\") pod \"barbican-d9ae-account-create-update-69mhh\" (UID: \"040a8750-e237-47d5-8b7b-8f310c436b87\") " pod="openstack/barbican-d9ae-account-create-update-69mhh" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.922609 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86a464ef-0333-4c29-a6a7-8af81a592e0b-operator-scripts\") pod \"neutron-db-create-hmzb4\" (UID: \"86a464ef-0333-4c29-a6a7-8af81a592e0b\") " pod="openstack/neutron-db-create-hmzb4" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.922703 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cht4\" (UniqueName: \"kubernetes.io/projected/dd80c486-9397-43f2-ba5e-c6f868a2a47a-kube-api-access-2cht4\") pod \"neutron-b660-account-create-update-5brnn\" (UID: \"dd80c486-9397-43f2-ba5e-c6f868a2a47a\") " pod="openstack/neutron-b660-account-create-update-5brnn" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.922792 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-762fv\" (UniqueName: \"kubernetes.io/projected/86a464ef-0333-4c29-a6a7-8af81a592e0b-kube-api-access-762fv\") pod \"neutron-db-create-hmzb4\" (UID: \"86a464ef-0333-4c29-a6a7-8af81a592e0b\") " pod="openstack/neutron-db-create-hmzb4" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.922889 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79jwt\" (UniqueName: 
\"kubernetes.io/projected/040a8750-e237-47d5-8b7b-8f310c436b87-kube-api-access-79jwt\") pod \"barbican-d9ae-account-create-update-69mhh\" (UID: \"040a8750-e237-47d5-8b7b-8f310c436b87\") " pod="openstack/barbican-d9ae-account-create-update-69mhh" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.922987 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1e4b601-cb61-433b-8e06-cbd920071fc5-operator-scripts\") pod \"barbican-db-create-g8gpn\" (UID: \"b1e4b601-cb61-433b-8e06-cbd920071fc5\") " pod="openstack/barbican-db-create-g8gpn" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.923461 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/040a8750-e237-47d5-8b7b-8f310c436b87-operator-scripts\") pod \"barbican-d9ae-account-create-update-69mhh\" (UID: \"040a8750-e237-47d5-8b7b-8f310c436b87\") " pod="openstack/barbican-d9ae-account-create-update-69mhh" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.924474 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86a464ef-0333-4c29-a6a7-8af81a592e0b-operator-scripts\") pod \"neutron-db-create-hmzb4\" (UID: \"86a464ef-0333-4c29-a6a7-8af81a592e0b\") " pod="openstack/neutron-db-create-hmzb4" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.930355 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e2b5-account-create-update-bq26b" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.954423 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz256\" (UniqueName: \"kubernetes.io/projected/b1e4b601-cb61-433b-8e06-cbd920071fc5-kube-api-access-fz256\") pod \"barbican-db-create-g8gpn\" (UID: \"b1e4b601-cb61-433b-8e06-cbd920071fc5\") " pod="openstack/barbican-db-create-g8gpn" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.964592 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-762fv\" (UniqueName: \"kubernetes.io/projected/86a464ef-0333-4c29-a6a7-8af81a592e0b-kube-api-access-762fv\") pod \"neutron-db-create-hmzb4\" (UID: \"86a464ef-0333-4c29-a6a7-8af81a592e0b\") " pod="openstack/neutron-db-create-hmzb4" Feb 21 22:02:13 crc kubenswrapper[4717]: I0221 22:02:13.968644 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79jwt\" (UniqueName: \"kubernetes.io/projected/040a8750-e237-47d5-8b7b-8f310c436b87-kube-api-access-79jwt\") pod \"barbican-d9ae-account-create-update-69mhh\" (UID: \"040a8750-e237-47d5-8b7b-8f310c436b87\") " pod="openstack/barbican-d9ae-account-create-update-69mhh" Feb 21 22:02:14 crc kubenswrapper[4717]: I0221 22:02:14.006659 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-g8gpn" Feb 21 22:02:14 crc kubenswrapper[4717]: I0221 22:02:14.025039 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd80c486-9397-43f2-ba5e-c6f868a2a47a-operator-scripts\") pod \"neutron-b660-account-create-update-5brnn\" (UID: \"dd80c486-9397-43f2-ba5e-c6f868a2a47a\") " pod="openstack/neutron-b660-account-create-update-5brnn" Feb 21 22:02:14 crc kubenswrapper[4717]: I0221 22:02:14.025075 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7m8mk" Feb 21 22:02:14 crc kubenswrapper[4717]: I0221 22:02:14.025156 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cht4\" (UniqueName: \"kubernetes.io/projected/dd80c486-9397-43f2-ba5e-c6f868a2a47a-kube-api-access-2cht4\") pod \"neutron-b660-account-create-update-5brnn\" (UID: \"dd80c486-9397-43f2-ba5e-c6f868a2a47a\") " pod="openstack/neutron-b660-account-create-update-5brnn" Feb 21 22:02:14 crc kubenswrapper[4717]: I0221 22:02:14.027242 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd80c486-9397-43f2-ba5e-c6f868a2a47a-operator-scripts\") pod \"neutron-b660-account-create-update-5brnn\" (UID: \"dd80c486-9397-43f2-ba5e-c6f868a2a47a\") " pod="openstack/neutron-b660-account-create-update-5brnn" Feb 21 22:02:14 crc kubenswrapper[4717]: I0221 22:02:14.045124 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cht4\" (UniqueName: \"kubernetes.io/projected/dd80c486-9397-43f2-ba5e-c6f868a2a47a-kube-api-access-2cht4\") pod \"neutron-b660-account-create-update-5brnn\" (UID: \"dd80c486-9397-43f2-ba5e-c6f868a2a47a\") " pod="openstack/neutron-b660-account-create-update-5brnn" Feb 21 22:02:14 crc kubenswrapper[4717]: I0221 22:02:14.058311 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hmzb4" Feb 21 22:02:14 crc kubenswrapper[4717]: I0221 22:02:14.073990 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d9ae-account-create-update-69mhh" Feb 21 22:02:14 crc kubenswrapper[4717]: I0221 22:02:14.236316 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b660-account-create-update-5brnn" Feb 21 22:02:16 crc kubenswrapper[4717]: I0221 22:02:16.942986 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:17 crc kubenswrapper[4717]: I0221 22:02:17.039763 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-v7s8s"] Feb 21 22:02:17 crc kubenswrapper[4717]: I0221 22:02:17.040067 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-v7s8s" podUID="ea234932-2730-4dfa-9e21-91bc7575a885" containerName="dnsmasq-dns" containerID="cri-o://a6a3a5694ed3bec429f0997e51c6a13341eb16d4169e60fc261ba7937e0b5acf" gracePeriod=10 Feb 21 22:02:18 crc kubenswrapper[4717]: I0221 22:02:18.433490 4717 generic.go:334] "Generic (PLEG): container finished" podID="ea234932-2730-4dfa-9e21-91bc7575a885" containerID="a6a3a5694ed3bec429f0997e51c6a13341eb16d4169e60fc261ba7937e0b5acf" exitCode=0 Feb 21 22:02:18 crc kubenswrapper[4717]: I0221 22:02:18.433558 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-v7s8s" event={"ID":"ea234932-2730-4dfa-9e21-91bc7575a885","Type":"ContainerDied","Data":"a6a3a5694ed3bec429f0997e51c6a13341eb16d4169e60fc261ba7937e0b5acf"} Feb 21 22:02:18 crc kubenswrapper[4717]: I0221 22:02:18.616140 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-v7s8s" podUID="ea234932-2730-4dfa-9e21-91bc7575a885" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.109:5353: connect: connection refused" Feb 21 22:02:19 crc kubenswrapper[4717]: E0221 22:02:19.030920 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Feb 21 22:02:19 crc kubenswrapper[4717]: E0221 
22:02:19.031109 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jz64r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{}
,StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-4ttdv_openstack(7995c515-84a0-44d3-82e8-99a2ab1fb7b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 21 22:02:19 crc kubenswrapper[4717]: E0221 22:02:19.032301 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-4ttdv" podUID="7995c515-84a0-44d3-82e8-99a2ab1fb7b2" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.392053 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:02:19 crc kubenswrapper[4717]: E0221 22:02:19.393973 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 68c4e431bf9dc3ae85e13aa06b69dcd05ee7e1260772e81eaab26777d05f29b4 is running failed: container process not found" containerID="68c4e431bf9dc3ae85e13aa06b69dcd05ee7e1260772e81eaab26777d05f29b4" cmd=["grpc_health_probe","-addr=:50051"] Feb 21 22:02:19 crc kubenswrapper[4717]: E0221 22:02:19.394305 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 68c4e431bf9dc3ae85e13aa06b69dcd05ee7e1260772e81eaab26777d05f29b4 is running failed: container process not found" containerID="68c4e431bf9dc3ae85e13aa06b69dcd05ee7e1260772e81eaab26777d05f29b4" cmd=["grpc_health_probe","-addr=:50051"] Feb 21 22:02:19 crc kubenswrapper[4717]: E0221 22:02:19.395104 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 68c4e431bf9dc3ae85e13aa06b69dcd05ee7e1260772e81eaab26777d05f29b4 is 
running failed: container process not found" containerID="68c4e431bf9dc3ae85e13aa06b69dcd05ee7e1260772e81eaab26777d05f29b4" cmd=["grpc_health_probe","-addr=:50051"] Feb 21 22:02:19 crc kubenswrapper[4717]: E0221 22:02:19.395177 4717 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 68c4e431bf9dc3ae85e13aa06b69dcd05ee7e1260772e81eaab26777d05f29b4 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-nqq7s" podUID="db87482a-6aa4-49f8-ac16-b2aa288196ff" containerName="registry-server" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.431107 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-ovsdbserver-nb\") pod \"ea234932-2730-4dfa-9e21-91bc7575a885\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.431184 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-dns-svc\") pod \"ea234932-2730-4dfa-9e21-91bc7575a885\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.431225 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcvk2\" (UniqueName: \"kubernetes.io/projected/ea234932-2730-4dfa-9e21-91bc7575a885-kube-api-access-tcvk2\") pod \"ea234932-2730-4dfa-9e21-91bc7575a885\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.431249 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-config\") pod \"ea234932-2730-4dfa-9e21-91bc7575a885\" (UID: 
\"ea234932-2730-4dfa-9e21-91bc7575a885\") " Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.431301 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-ovsdbserver-sb\") pod \"ea234932-2730-4dfa-9e21-91bc7575a885\" (UID: \"ea234932-2730-4dfa-9e21-91bc7575a885\") " Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.437371 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea234932-2730-4dfa-9e21-91bc7575a885-kube-api-access-tcvk2" (OuterVolumeSpecName: "kube-api-access-tcvk2") pod "ea234932-2730-4dfa-9e21-91bc7575a885" (UID: "ea234932-2730-4dfa-9e21-91bc7575a885"). InnerVolumeSpecName "kube-api-access-tcvk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.452365 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-v7s8s" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.452724 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-v7s8s" event={"ID":"ea234932-2730-4dfa-9e21-91bc7575a885","Type":"ContainerDied","Data":"ec030064b7cac483acc7a051a8df35404d40eb1335338dbff253d68be60b9517"} Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.452837 4717 scope.go:117] "RemoveContainer" containerID="a6a3a5694ed3bec429f0997e51c6a13341eb16d4169e60fc261ba7937e0b5acf" Feb 21 22:02:19 crc kubenswrapper[4717]: E0221 22:02:19.456505 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-4ttdv" podUID="7995c515-84a0-44d3-82e8-99a2ab1fb7b2" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.479465 4717 
scope.go:117] "RemoveContainer" containerID="a8ad14f330d1b3520f6f9087e70cfa2f8da47f1603e8986db1a7b3b9814e50ae" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.480203 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea234932-2730-4dfa-9e21-91bc7575a885" (UID: "ea234932-2730-4dfa-9e21-91bc7575a885"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.497082 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ea234932-2730-4dfa-9e21-91bc7575a885" (UID: "ea234932-2730-4dfa-9e21-91bc7575a885"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.503278 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ea234932-2730-4dfa-9e21-91bc7575a885" (UID: "ea234932-2730-4dfa-9e21-91bc7575a885"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.511364 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-config" (OuterVolumeSpecName: "config") pod "ea234932-2730-4dfa-9e21-91bc7575a885" (UID: "ea234932-2730-4dfa-9e21-91bc7575a885"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.520134 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.533181 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.533202 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.533211 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.533219 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea234932-2730-4dfa-9e21-91bc7575a885-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.533228 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcvk2\" (UniqueName: \"kubernetes.io/projected/ea234932-2730-4dfa-9e21-91bc7575a885-kube-api-access-tcvk2\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.628548 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vwk86"] Feb 21 22:02:19 crc kubenswrapper[4717]: W0221 22:02:19.632692 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff13878c_357b_4a39_8b6b_f4f6e1929fed.slice/crio-c077af9db265446b4355064abdf393cc012d36f42192837f86347e1e7aff2982 WatchSource:0}: Error finding container c077af9db265446b4355064abdf393cc012d36f42192837f86347e1e7aff2982: Status 404 returned 
error can't find the container with id c077af9db265446b4355064abdf393cc012d36f42192837f86347e1e7aff2982 Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.634017 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db87482a-6aa4-49f8-ac16-b2aa288196ff-catalog-content\") pod \"db87482a-6aa4-49f8-ac16-b2aa288196ff\" (UID: \"db87482a-6aa4-49f8-ac16-b2aa288196ff\") " Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.634053 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fjfg\" (UniqueName: \"kubernetes.io/projected/db87482a-6aa4-49f8-ac16-b2aa288196ff-kube-api-access-8fjfg\") pod \"db87482a-6aa4-49f8-ac16-b2aa288196ff\" (UID: \"db87482a-6aa4-49f8-ac16-b2aa288196ff\") " Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.634095 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db87482a-6aa4-49f8-ac16-b2aa288196ff-utilities\") pod \"db87482a-6aa4-49f8-ac16-b2aa288196ff\" (UID: \"db87482a-6aa4-49f8-ac16-b2aa288196ff\") " Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.635536 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db87482a-6aa4-49f8-ac16-b2aa288196ff-utilities" (OuterVolumeSpecName: "utilities") pod "db87482a-6aa4-49f8-ac16-b2aa288196ff" (UID: "db87482a-6aa4-49f8-ac16-b2aa288196ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.647470 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db87482a-6aa4-49f8-ac16-b2aa288196ff-kube-api-access-8fjfg" (OuterVolumeSpecName: "kube-api-access-8fjfg") pod "db87482a-6aa4-49f8-ac16-b2aa288196ff" (UID: "db87482a-6aa4-49f8-ac16-b2aa288196ff"). 
InnerVolumeSpecName "kube-api-access-8fjfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.670264 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db87482a-6aa4-49f8-ac16-b2aa288196ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db87482a-6aa4-49f8-ac16-b2aa288196ff" (UID: "db87482a-6aa4-49f8-ac16-b2aa288196ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:02:19 crc kubenswrapper[4717]: W0221 22:02:19.671840 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1e4b601_cb61_433b_8e06_cbd920071fc5.slice/crio-d7ddb5b7920d5c572ef6940c2306541c93e2ceda7fcfdabcf2f81d50e11cfb81 WatchSource:0}: Error finding container d7ddb5b7920d5c572ef6940c2306541c93e2ceda7fcfdabcf2f81d50e11cfb81: Status 404 returned error can't find the container with id d7ddb5b7920d5c572ef6940c2306541c93e2ceda7fcfdabcf2f81d50e11cfb81 Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.682221 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-g8gpn"] Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.736730 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db87482a-6aa4-49f8-ac16-b2aa288196ff-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.736946 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fjfg\" (UniqueName: \"kubernetes.io/projected/db87482a-6aa4-49f8-ac16-b2aa288196ff-kube-api-access-8fjfg\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.737041 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/db87482a-6aa4-49f8-ac16-b2aa288196ff-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.792908 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-v7s8s"] Feb 21 22:02:19 crc kubenswrapper[4717]: I0221 22:02:19.799811 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-v7s8s"] Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.007769 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea234932-2730-4dfa-9e21-91bc7575a885" path="/var/lib/kubelet/pods/ea234932-2730-4dfa-9e21-91bc7575a885/volumes" Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.056301 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d9ae-account-create-update-69mhh"] Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.067138 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hmzb4"] Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.072016 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b660-account-create-update-5brnn"] Feb 21 22:02:20 crc kubenswrapper[4717]: W0221 22:02:20.073153 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86a464ef_0333_4c29_a6a7_8af81a592e0b.slice/crio-387f69c61d10d438838c6f4fb3d061c18679864893ddeb5f24000b415c3aa77a WatchSource:0}: Error finding container 387f69c61d10d438838c6f4fb3d061c18679864893ddeb5f24000b415c3aa77a: Status 404 returned error can't find the container with id 387f69c61d10d438838c6f4fb3d061c18679864893ddeb5f24000b415c3aa77a Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.080707 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e2b5-account-create-update-bq26b"] Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.103311 4717 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7m8mk"] Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.466961 4717 generic.go:334] "Generic (PLEG): container finished" podID="ff13878c-357b-4a39-8b6b-f4f6e1929fed" containerID="452e5f9e478d34e1dd34d9506c5322fba26c49d545d7c820c5e7dc01b782c588" exitCode=0 Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.467396 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vwk86" event={"ID":"ff13878c-357b-4a39-8b6b-f4f6e1929fed","Type":"ContainerDied","Data":"452e5f9e478d34e1dd34d9506c5322fba26c49d545d7c820c5e7dc01b782c588"} Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.467428 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vwk86" event={"ID":"ff13878c-357b-4a39-8b6b-f4f6e1929fed","Type":"ContainerStarted","Data":"c077af9db265446b4355064abdf393cc012d36f42192837f86347e1e7aff2982"} Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.477648 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e2b5-account-create-update-bq26b" event={"ID":"b7dda56c-0b65-46d4-9f88-dda7c73423da","Type":"ContainerStarted","Data":"9dd3c013e4aac797c1861804442abb4e6cc8f5f1159f868c819c8fe145c0ab71"} Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.486047 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hmzb4" event={"ID":"86a464ef-0333-4c29-a6a7-8af81a592e0b","Type":"ContainerStarted","Data":"e6c2a0752dc5328f8b1331059bc4c3578c5cc53cd20854e0fc6cf06f7e5ac335"} Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.486089 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hmzb4" event={"ID":"86a464ef-0333-4c29-a6a7-8af81a592e0b","Type":"ContainerStarted","Data":"387f69c61d10d438838c6f4fb3d061c18679864893ddeb5f24000b415c3aa77a"} Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.489191 4717 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-nqq7s" event={"ID":"db87482a-6aa4-49f8-ac16-b2aa288196ff","Type":"ContainerDied","Data":"a6a14e016e7107e99c670a5f051485d8434b9ff07d11cdd99bb602552408de43"} Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.489224 4717 scope.go:117] "RemoveContainer" containerID="68c4e431bf9dc3ae85e13aa06b69dcd05ee7e1260772e81eaab26777d05f29b4" Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.489320 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqq7s" Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.492568 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7m8mk" event={"ID":"99b03743-48fc-4006-8cfd-b912deba0232","Type":"ContainerStarted","Data":"69d13a742739e189c4cc86873988b5be3e5ab7365128281169ea2fa991bac80e"} Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.494068 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d9ae-account-create-update-69mhh" event={"ID":"040a8750-e237-47d5-8b7b-8f310c436b87","Type":"ContainerStarted","Data":"e6017d24e06b47a37b702ff3ba519c26ee1a0787dfb9e993919e9417dd0de59b"} Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.499054 4717 generic.go:334] "Generic (PLEG): container finished" podID="b1e4b601-cb61-433b-8e06-cbd920071fc5" containerID="d030300c9eaa4bb2ebe31023b3dcacc5c3e7cd4b2f3bb3489184bc74fcda5bb6" exitCode=0 Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.499168 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-g8gpn" event={"ID":"b1e4b601-cb61-433b-8e06-cbd920071fc5","Type":"ContainerDied","Data":"d030300c9eaa4bb2ebe31023b3dcacc5c3e7cd4b2f3bb3489184bc74fcda5bb6"} Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.499203 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-g8gpn" 
event={"ID":"b1e4b601-cb61-433b-8e06-cbd920071fc5","Type":"ContainerStarted","Data":"d7ddb5b7920d5c572ef6940c2306541c93e2ceda7fcfdabcf2f81d50e11cfb81"} Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.500649 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b660-account-create-update-5brnn" event={"ID":"dd80c486-9397-43f2-ba5e-c6f868a2a47a","Type":"ContainerStarted","Data":"72a54d9b718436cd6b90d1ab849559844b097c7473a6ee1dda7815ae40a6002b"} Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.517370 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-hmzb4" podStartSLOduration=7.517349436 podStartE2EDuration="7.517349436s" podCreationTimestamp="2026-02-21 22:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:02:20.507174471 +0000 UTC m=+955.288708103" watchObservedRunningTime="2026-02-21 22:02:20.517349436 +0000 UTC m=+955.298883068" Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.527790 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b660-account-create-update-5brnn" podStartSLOduration=7.527770125 podStartE2EDuration="7.527770125s" podCreationTimestamp="2026-02-21 22:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:02:20.522501748 +0000 UTC m=+955.304035390" watchObservedRunningTime="2026-02-21 22:02:20.527770125 +0000 UTC m=+955.309303757" Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.532328 4717 scope.go:117] "RemoveContainer" containerID="150c4eaeb07ccfee3fdcf7e736f2b0442837add1f56e04a5701a182c52de38c5" Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.559819 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqq7s"] Feb 21 22:02:20 crc 
kubenswrapper[4717]: I0221 22:02:20.567386 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqq7s"] Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.569806 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-d9ae-account-create-update-69mhh" podStartSLOduration=7.569790262 podStartE2EDuration="7.569790262s" podCreationTimestamp="2026-02-21 22:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:02:20.56136475 +0000 UTC m=+955.342898372" watchObservedRunningTime="2026-02-21 22:02:20.569790262 +0000 UTC m=+955.351323884" Feb 21 22:02:20 crc kubenswrapper[4717]: I0221 22:02:20.581781 4717 scope.go:117] "RemoveContainer" containerID="bd660dac1eafa5f13673af9c9a7cc1f785e4b4ee09500acb579efe9a4221db46" Feb 21 22:02:21 crc kubenswrapper[4717]: I0221 22:02:21.509396 4717 generic.go:334] "Generic (PLEG): container finished" podID="dd80c486-9397-43f2-ba5e-c6f868a2a47a" containerID="8405cf1ce90dabded9e1452bc6e06c60e776fad26ceb6f912e58175ac3b1c571" exitCode=0 Feb 21 22:02:21 crc kubenswrapper[4717]: I0221 22:02:21.509666 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b660-account-create-update-5brnn" event={"ID":"dd80c486-9397-43f2-ba5e-c6f868a2a47a","Type":"ContainerDied","Data":"8405cf1ce90dabded9e1452bc6e06c60e776fad26ceb6f912e58175ac3b1c571"} Feb 21 22:02:21 crc kubenswrapper[4717]: I0221 22:02:21.516410 4717 generic.go:334] "Generic (PLEG): container finished" podID="b7dda56c-0b65-46d4-9f88-dda7c73423da" containerID="66f7b17be06f663f8dd046236b2bbd86a3a97c9fef06057e31fbd5121aa231a7" exitCode=0 Feb 21 22:02:21 crc kubenswrapper[4717]: I0221 22:02:21.516465 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e2b5-account-create-update-bq26b" 
event={"ID":"b7dda56c-0b65-46d4-9f88-dda7c73423da","Type":"ContainerDied","Data":"66f7b17be06f663f8dd046236b2bbd86a3a97c9fef06057e31fbd5121aa231a7"} Feb 21 22:02:21 crc kubenswrapper[4717]: I0221 22:02:21.518500 4717 generic.go:334] "Generic (PLEG): container finished" podID="86a464ef-0333-4c29-a6a7-8af81a592e0b" containerID="e6c2a0752dc5328f8b1331059bc4c3578c5cc53cd20854e0fc6cf06f7e5ac335" exitCode=0 Feb 21 22:02:21 crc kubenswrapper[4717]: I0221 22:02:21.518537 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hmzb4" event={"ID":"86a464ef-0333-4c29-a6a7-8af81a592e0b","Type":"ContainerDied","Data":"e6c2a0752dc5328f8b1331059bc4c3578c5cc53cd20854e0fc6cf06f7e5ac335"} Feb 21 22:02:21 crc kubenswrapper[4717]: I0221 22:02:21.524224 4717 generic.go:334] "Generic (PLEG): container finished" podID="040a8750-e237-47d5-8b7b-8f310c436b87" containerID="75cf48369ea01327d45e0794cbeacde6407dba313b0557a0f9009b1ce4f0b902" exitCode=0 Feb 21 22:02:21 crc kubenswrapper[4717]: I0221 22:02:21.524796 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d9ae-account-create-update-69mhh" event={"ID":"040a8750-e237-47d5-8b7b-8f310c436b87","Type":"ContainerDied","Data":"75cf48369ea01327d45e0794cbeacde6407dba313b0557a0f9009b1ce4f0b902"} Feb 21 22:02:21 crc kubenswrapper[4717]: I0221 22:02:21.937944 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-g8gpn" Feb 21 22:02:21 crc kubenswrapper[4717]: I0221 22:02:21.944301 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-vwk86" Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.005950 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db87482a-6aa4-49f8-ac16-b2aa288196ff" path="/var/lib/kubelet/pods/db87482a-6aa4-49f8-ac16-b2aa288196ff/volumes" Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.089087 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz256\" (UniqueName: \"kubernetes.io/projected/b1e4b601-cb61-433b-8e06-cbd920071fc5-kube-api-access-fz256\") pod \"b1e4b601-cb61-433b-8e06-cbd920071fc5\" (UID: \"b1e4b601-cb61-433b-8e06-cbd920071fc5\") " Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.089361 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4w8h\" (UniqueName: \"kubernetes.io/projected/ff13878c-357b-4a39-8b6b-f4f6e1929fed-kube-api-access-r4w8h\") pod \"ff13878c-357b-4a39-8b6b-f4f6e1929fed\" (UID: \"ff13878c-357b-4a39-8b6b-f4f6e1929fed\") " Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.089785 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1e4b601-cb61-433b-8e06-cbd920071fc5-operator-scripts\") pod \"b1e4b601-cb61-433b-8e06-cbd920071fc5\" (UID: \"b1e4b601-cb61-433b-8e06-cbd920071fc5\") " Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.089826 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff13878c-357b-4a39-8b6b-f4f6e1929fed-operator-scripts\") pod \"ff13878c-357b-4a39-8b6b-f4f6e1929fed\" (UID: \"ff13878c-357b-4a39-8b6b-f4f6e1929fed\") " Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.091061 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1e4b601-cb61-433b-8e06-cbd920071fc5-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "b1e4b601-cb61-433b-8e06-cbd920071fc5" (UID: "b1e4b601-cb61-433b-8e06-cbd920071fc5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.091060 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff13878c-357b-4a39-8b6b-f4f6e1929fed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff13878c-357b-4a39-8b6b-f4f6e1929fed" (UID: "ff13878c-357b-4a39-8b6b-f4f6e1929fed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.095712 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff13878c-357b-4a39-8b6b-f4f6e1929fed-kube-api-access-r4w8h" (OuterVolumeSpecName: "kube-api-access-r4w8h") pod "ff13878c-357b-4a39-8b6b-f4f6e1929fed" (UID: "ff13878c-357b-4a39-8b6b-f4f6e1929fed"). InnerVolumeSpecName "kube-api-access-r4w8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.096026 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1e4b601-cb61-433b-8e06-cbd920071fc5-kube-api-access-fz256" (OuterVolumeSpecName: "kube-api-access-fz256") pod "b1e4b601-cb61-433b-8e06-cbd920071fc5" (UID: "b1e4b601-cb61-433b-8e06-cbd920071fc5"). InnerVolumeSpecName "kube-api-access-fz256". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.193441 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4w8h\" (UniqueName: \"kubernetes.io/projected/ff13878c-357b-4a39-8b6b-f4f6e1929fed-kube-api-access-r4w8h\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.193489 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1e4b601-cb61-433b-8e06-cbd920071fc5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.193505 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff13878c-357b-4a39-8b6b-f4f6e1929fed-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.193520 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz256\" (UniqueName: \"kubernetes.io/projected/b1e4b601-cb61-433b-8e06-cbd920071fc5-kube-api-access-fz256\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.539017 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-g8gpn" event={"ID":"b1e4b601-cb61-433b-8e06-cbd920071fc5","Type":"ContainerDied","Data":"d7ddb5b7920d5c572ef6940c2306541c93e2ceda7fcfdabcf2f81d50e11cfb81"} Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.539058 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-g8gpn" Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.539074 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7ddb5b7920d5c572ef6940c2306541c93e2ceda7fcfdabcf2f81d50e11cfb81" Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.541849 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-vwk86" Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.541848 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vwk86" event={"ID":"ff13878c-357b-4a39-8b6b-f4f6e1929fed","Type":"ContainerDied","Data":"c077af9db265446b4355064abdf393cc012d36f42192837f86347e1e7aff2982"} Feb 21 22:02:22 crc kubenswrapper[4717]: I0221 22:02:22.541946 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c077af9db265446b4355064abdf393cc012d36f42192837f86347e1e7aff2982" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.577823 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hmzb4" event={"ID":"86a464ef-0333-4c29-a6a7-8af81a592e0b","Type":"ContainerDied","Data":"387f69c61d10d438838c6f4fb3d061c18679864893ddeb5f24000b415c3aa77a"} Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.578194 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="387f69c61d10d438838c6f4fb3d061c18679864893ddeb5f24000b415c3aa77a" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.580226 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hmzb4" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.580332 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d9ae-account-create-update-69mhh" event={"ID":"040a8750-e237-47d5-8b7b-8f310c436b87","Type":"ContainerDied","Data":"e6017d24e06b47a37b702ff3ba519c26ee1a0787dfb9e993919e9417dd0de59b"} Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.580357 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6017d24e06b47a37b702ff3ba519c26ee1a0787dfb9e993919e9417dd0de59b" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.582769 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b660-account-create-update-5brnn" event={"ID":"dd80c486-9397-43f2-ba5e-c6f868a2a47a","Type":"ContainerDied","Data":"72a54d9b718436cd6b90d1ab849559844b097c7473a6ee1dda7815ae40a6002b"} Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.582790 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72a54d9b718436cd6b90d1ab849559844b097c7473a6ee1dda7815ae40a6002b" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.585485 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e2b5-account-create-update-bq26b" event={"ID":"b7dda56c-0b65-46d4-9f88-dda7c73423da","Type":"ContainerDied","Data":"9dd3c013e4aac797c1861804442abb4e6cc8f5f1159f868c819c8fe145c0ab71"} Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.585527 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dd3c013e4aac797c1861804442abb4e6cc8f5f1159f868c819c8fe145c0ab71" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.585791 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b660-account-create-update-5brnn" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.596037 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d9ae-account-create-update-69mhh" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.632686 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e2b5-account-create-update-bq26b" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.643975 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79jwt\" (UniqueName: \"kubernetes.io/projected/040a8750-e237-47d5-8b7b-8f310c436b87-kube-api-access-79jwt\") pod \"040a8750-e237-47d5-8b7b-8f310c436b87\" (UID: \"040a8750-e237-47d5-8b7b-8f310c436b87\") " Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.644054 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7dda56c-0b65-46d4-9f88-dda7c73423da-operator-scripts\") pod \"b7dda56c-0b65-46d4-9f88-dda7c73423da\" (UID: \"b7dda56c-0b65-46d4-9f88-dda7c73423da\") " Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.644151 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-762fv\" (UniqueName: \"kubernetes.io/projected/86a464ef-0333-4c29-a6a7-8af81a592e0b-kube-api-access-762fv\") pod \"86a464ef-0333-4c29-a6a7-8af81a592e0b\" (UID: \"86a464ef-0333-4c29-a6a7-8af81a592e0b\") " Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.644188 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xp8j\" (UniqueName: \"kubernetes.io/projected/b7dda56c-0b65-46d4-9f88-dda7c73423da-kube-api-access-4xp8j\") pod \"b7dda56c-0b65-46d4-9f88-dda7c73423da\" (UID: \"b7dda56c-0b65-46d4-9f88-dda7c73423da\") " Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 
22:02:24.644220 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86a464ef-0333-4c29-a6a7-8af81a592e0b-operator-scripts\") pod \"86a464ef-0333-4c29-a6a7-8af81a592e0b\" (UID: \"86a464ef-0333-4c29-a6a7-8af81a592e0b\") " Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.644242 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd80c486-9397-43f2-ba5e-c6f868a2a47a-operator-scripts\") pod \"dd80c486-9397-43f2-ba5e-c6f868a2a47a\" (UID: \"dd80c486-9397-43f2-ba5e-c6f868a2a47a\") " Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.644306 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cht4\" (UniqueName: \"kubernetes.io/projected/dd80c486-9397-43f2-ba5e-c6f868a2a47a-kube-api-access-2cht4\") pod \"dd80c486-9397-43f2-ba5e-c6f868a2a47a\" (UID: \"dd80c486-9397-43f2-ba5e-c6f868a2a47a\") " Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.644339 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/040a8750-e237-47d5-8b7b-8f310c436b87-operator-scripts\") pod \"040a8750-e237-47d5-8b7b-8f310c436b87\" (UID: \"040a8750-e237-47d5-8b7b-8f310c436b87\") " Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.645360 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/040a8750-e237-47d5-8b7b-8f310c436b87-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "040a8750-e237-47d5-8b7b-8f310c436b87" (UID: "040a8750-e237-47d5-8b7b-8f310c436b87"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.648626 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd80c486-9397-43f2-ba5e-c6f868a2a47a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd80c486-9397-43f2-ba5e-c6f868a2a47a" (UID: "dd80c486-9397-43f2-ba5e-c6f868a2a47a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.649343 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86a464ef-0333-4c29-a6a7-8af81a592e0b-kube-api-access-762fv" (OuterVolumeSpecName: "kube-api-access-762fv") pod "86a464ef-0333-4c29-a6a7-8af81a592e0b" (UID: "86a464ef-0333-4c29-a6a7-8af81a592e0b"). InnerVolumeSpecName "kube-api-access-762fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.650133 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7dda56c-0b65-46d4-9f88-dda7c73423da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b7dda56c-0b65-46d4-9f88-dda7c73423da" (UID: "b7dda56c-0b65-46d4-9f88-dda7c73423da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.650990 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/040a8750-e237-47d5-8b7b-8f310c436b87-kube-api-access-79jwt" (OuterVolumeSpecName: "kube-api-access-79jwt") pod "040a8750-e237-47d5-8b7b-8f310c436b87" (UID: "040a8750-e237-47d5-8b7b-8f310c436b87"). InnerVolumeSpecName "kube-api-access-79jwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.651387 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7dda56c-0b65-46d4-9f88-dda7c73423da-kube-api-access-4xp8j" (OuterVolumeSpecName: "kube-api-access-4xp8j") pod "b7dda56c-0b65-46d4-9f88-dda7c73423da" (UID: "b7dda56c-0b65-46d4-9f88-dda7c73423da"). InnerVolumeSpecName "kube-api-access-4xp8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.655357 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86a464ef-0333-4c29-a6a7-8af81a592e0b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86a464ef-0333-4c29-a6a7-8af81a592e0b" (UID: "86a464ef-0333-4c29-a6a7-8af81a592e0b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.659180 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd80c486-9397-43f2-ba5e-c6f868a2a47a-kube-api-access-2cht4" (OuterVolumeSpecName: "kube-api-access-2cht4") pod "dd80c486-9397-43f2-ba5e-c6f868a2a47a" (UID: "dd80c486-9397-43f2-ba5e-c6f868a2a47a"). InnerVolumeSpecName "kube-api-access-2cht4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.745909 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xp8j\" (UniqueName: \"kubernetes.io/projected/b7dda56c-0b65-46d4-9f88-dda7c73423da-kube-api-access-4xp8j\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.745939 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86a464ef-0333-4c29-a6a7-8af81a592e0b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.745948 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd80c486-9397-43f2-ba5e-c6f868a2a47a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.745958 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cht4\" (UniqueName: \"kubernetes.io/projected/dd80c486-9397-43f2-ba5e-c6f868a2a47a-kube-api-access-2cht4\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.745967 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/040a8750-e237-47d5-8b7b-8f310c436b87-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.745978 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79jwt\" (UniqueName: \"kubernetes.io/projected/040a8750-e237-47d5-8b7b-8f310c436b87-kube-api-access-79jwt\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.745989 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7dda56c-0b65-46d4-9f88-dda7c73423da-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 
22:02:25 crc kubenswrapper[4717]: I0221 22:02:24.745996 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-762fv\" (UniqueName: \"kubernetes.io/projected/86a464ef-0333-4c29-a6a7-8af81a592e0b-kube-api-access-762fv\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:25.596544 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b660-account-create-update-5brnn" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:25.597982 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7m8mk" event={"ID":"99b03743-48fc-4006-8cfd-b912deba0232","Type":"ContainerStarted","Data":"73880c7cdb898cf7e7d9bcced9ef2f2fa003fe97c191d6135c470578c7ae90f6"} Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:25.598078 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e2b5-account-create-update-bq26b" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:25.601973 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hmzb4" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:25.602044 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d9ae-account-create-update-69mhh" Feb 21 22:02:25 crc kubenswrapper[4717]: I0221 22:02:25.632018 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-7m8mk" podStartSLOduration=8.250773791 podStartE2EDuration="12.631986802s" podCreationTimestamp="2026-02-21 22:02:13 +0000 UTC" firstStartedPulling="2026-02-21 22:02:20.114023 +0000 UTC m=+954.895556622" lastFinishedPulling="2026-02-21 22:02:24.495235981 +0000 UTC m=+959.276769633" observedRunningTime="2026-02-21 22:02:25.623341495 +0000 UTC m=+960.404875117" watchObservedRunningTime="2026-02-21 22:02:25.631986802 +0000 UTC m=+960.413520494" Feb 21 22:02:27 crc kubenswrapper[4717]: I0221 22:02:27.627763 4717 generic.go:334] "Generic (PLEG): container finished" podID="99b03743-48fc-4006-8cfd-b912deba0232" containerID="73880c7cdb898cf7e7d9bcced9ef2f2fa003fe97c191d6135c470578c7ae90f6" exitCode=0 Feb 21 22:02:27 crc kubenswrapper[4717]: I0221 22:02:27.627909 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7m8mk" event={"ID":"99b03743-48fc-4006-8cfd-b912deba0232","Type":"ContainerDied","Data":"73880c7cdb898cf7e7d9bcced9ef2f2fa003fe97c191d6135c470578c7ae90f6"} Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.044893 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7m8mk" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.119204 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b03743-48fc-4006-8cfd-b912deba0232-combined-ca-bundle\") pod \"99b03743-48fc-4006-8cfd-b912deba0232\" (UID: \"99b03743-48fc-4006-8cfd-b912deba0232\") " Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.119274 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b03743-48fc-4006-8cfd-b912deba0232-config-data\") pod \"99b03743-48fc-4006-8cfd-b912deba0232\" (UID: \"99b03743-48fc-4006-8cfd-b912deba0232\") " Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.119350 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5km2l\" (UniqueName: \"kubernetes.io/projected/99b03743-48fc-4006-8cfd-b912deba0232-kube-api-access-5km2l\") pod \"99b03743-48fc-4006-8cfd-b912deba0232\" (UID: \"99b03743-48fc-4006-8cfd-b912deba0232\") " Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.164409 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99b03743-48fc-4006-8cfd-b912deba0232-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99b03743-48fc-4006-8cfd-b912deba0232" (UID: "99b03743-48fc-4006-8cfd-b912deba0232"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.168529 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b03743-48fc-4006-8cfd-b912deba0232-kube-api-access-5km2l" (OuterVolumeSpecName: "kube-api-access-5km2l") pod "99b03743-48fc-4006-8cfd-b912deba0232" (UID: "99b03743-48fc-4006-8cfd-b912deba0232"). InnerVolumeSpecName "kube-api-access-5km2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.186841 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-knh79"] Feb 21 22:02:29 crc kubenswrapper[4717]: E0221 22:02:29.187226 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db87482a-6aa4-49f8-ac16-b2aa288196ff" containerName="extract-content" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187246 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="db87482a-6aa4-49f8-ac16-b2aa288196ff" containerName="extract-content" Feb 21 22:02:29 crc kubenswrapper[4717]: E0221 22:02:29.187262 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7dda56c-0b65-46d4-9f88-dda7c73423da" containerName="mariadb-account-create-update" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187273 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7dda56c-0b65-46d4-9f88-dda7c73423da" containerName="mariadb-account-create-update" Feb 21 22:02:29 crc kubenswrapper[4717]: E0221 22:02:29.187290 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd80c486-9397-43f2-ba5e-c6f868a2a47a" containerName="mariadb-account-create-update" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187299 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd80c486-9397-43f2-ba5e-c6f868a2a47a" containerName="mariadb-account-create-update" Feb 21 22:02:29 crc kubenswrapper[4717]: E0221 22:02:29.187315 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86a464ef-0333-4c29-a6a7-8af81a592e0b" containerName="mariadb-database-create" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187324 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="86a464ef-0333-4c29-a6a7-8af81a592e0b" containerName="mariadb-database-create" Feb 21 22:02:29 crc kubenswrapper[4717]: E0221 22:02:29.187341 4717 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b1e4b601-cb61-433b-8e06-cbd920071fc5" containerName="mariadb-database-create" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187350 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1e4b601-cb61-433b-8e06-cbd920071fc5" containerName="mariadb-database-create" Feb 21 22:02:29 crc kubenswrapper[4717]: E0221 22:02:29.187369 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b03743-48fc-4006-8cfd-b912deba0232" containerName="keystone-db-sync" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187377 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b03743-48fc-4006-8cfd-b912deba0232" containerName="keystone-db-sync" Feb 21 22:02:29 crc kubenswrapper[4717]: E0221 22:02:29.187392 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea234932-2730-4dfa-9e21-91bc7575a885" containerName="dnsmasq-dns" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187400 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea234932-2730-4dfa-9e21-91bc7575a885" containerName="dnsmasq-dns" Feb 21 22:02:29 crc kubenswrapper[4717]: E0221 22:02:29.187418 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff13878c-357b-4a39-8b6b-f4f6e1929fed" containerName="mariadb-database-create" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187428 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff13878c-357b-4a39-8b6b-f4f6e1929fed" containerName="mariadb-database-create" Feb 21 22:02:29 crc kubenswrapper[4717]: E0221 22:02:29.187442 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db87482a-6aa4-49f8-ac16-b2aa288196ff" containerName="extract-utilities" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187451 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="db87482a-6aa4-49f8-ac16-b2aa288196ff" containerName="extract-utilities" Feb 21 22:02:29 crc kubenswrapper[4717]: E0221 22:02:29.187467 4717 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ea234932-2730-4dfa-9e21-91bc7575a885" containerName="init" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187475 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea234932-2730-4dfa-9e21-91bc7575a885" containerName="init" Feb 21 22:02:29 crc kubenswrapper[4717]: E0221 22:02:29.187487 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db87482a-6aa4-49f8-ac16-b2aa288196ff" containerName="registry-server" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187495 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="db87482a-6aa4-49f8-ac16-b2aa288196ff" containerName="registry-server" Feb 21 22:02:29 crc kubenswrapper[4717]: E0221 22:02:29.187512 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040a8750-e237-47d5-8b7b-8f310c436b87" containerName="mariadb-account-create-update" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187520 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="040a8750-e237-47d5-8b7b-8f310c436b87" containerName="mariadb-account-create-update" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187717 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7dda56c-0b65-46d4-9f88-dda7c73423da" containerName="mariadb-account-create-update" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187732 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd80c486-9397-43f2-ba5e-c6f868a2a47a" containerName="mariadb-account-create-update" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187747 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="86a464ef-0333-4c29-a6a7-8af81a592e0b" containerName="mariadb-database-create" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187762 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff13878c-357b-4a39-8b6b-f4f6e1929fed" containerName="mariadb-database-create" Feb 21 22:02:29 crc 
kubenswrapper[4717]: I0221 22:02:29.187771 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="99b03743-48fc-4006-8cfd-b912deba0232" containerName="keystone-db-sync" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187787 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1e4b601-cb61-433b-8e06-cbd920071fc5" containerName="mariadb-database-create" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187807 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="040a8750-e237-47d5-8b7b-8f310c436b87" containerName="mariadb-account-create-update" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187825 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea234932-2730-4dfa-9e21-91bc7575a885" containerName="dnsmasq-dns" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.187838 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="db87482a-6aa4-49f8-ac16-b2aa288196ff" containerName="registry-server" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.189534 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.223956 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bsd2\" (UniqueName: \"kubernetes.io/projected/b0ef6389-987a-492e-8324-8b88a70f659f-kube-api-access-5bsd2\") pod \"community-operators-knh79\" (UID: \"b0ef6389-987a-492e-8324-8b88a70f659f\") " pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.224021 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ef6389-987a-492e-8324-8b88a70f659f-utilities\") pod \"community-operators-knh79\" (UID: \"b0ef6389-987a-492e-8324-8b88a70f659f\") " pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.224055 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ef6389-987a-492e-8324-8b88a70f659f-catalog-content\") pod \"community-operators-knh79\" (UID: \"b0ef6389-987a-492e-8324-8b88a70f659f\") " pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.224148 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5km2l\" (UniqueName: \"kubernetes.io/projected/99b03743-48fc-4006-8cfd-b912deba0232-kube-api-access-5km2l\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.224163 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b03743-48fc-4006-8cfd-b912deba0232-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.227106 4717 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99b03743-48fc-4006-8cfd-b912deba0232-config-data" (OuterVolumeSpecName: "config-data") pod "99b03743-48fc-4006-8cfd-b912deba0232" (UID: "99b03743-48fc-4006-8cfd-b912deba0232"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.230759 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-knh79"] Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.325016 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ef6389-987a-492e-8324-8b88a70f659f-catalog-content\") pod \"community-operators-knh79\" (UID: \"b0ef6389-987a-492e-8324-8b88a70f659f\") " pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.325568 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bsd2\" (UniqueName: \"kubernetes.io/projected/b0ef6389-987a-492e-8324-8b88a70f659f-kube-api-access-5bsd2\") pod \"community-operators-knh79\" (UID: \"b0ef6389-987a-492e-8324-8b88a70f659f\") " pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.325613 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ef6389-987a-492e-8324-8b88a70f659f-utilities\") pod \"community-operators-knh79\" (UID: \"b0ef6389-987a-492e-8324-8b88a70f659f\") " pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.325655 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b03743-48fc-4006-8cfd-b912deba0232-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:29 crc kubenswrapper[4717]: 
I0221 22:02:29.325855 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ef6389-987a-492e-8324-8b88a70f659f-utilities\") pod \"community-operators-knh79\" (UID: \"b0ef6389-987a-492e-8324-8b88a70f659f\") " pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.325471 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ef6389-987a-492e-8324-8b88a70f659f-catalog-content\") pod \"community-operators-knh79\" (UID: \"b0ef6389-987a-492e-8324-8b88a70f659f\") " pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.342455 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bsd2\" (UniqueName: \"kubernetes.io/projected/b0ef6389-987a-492e-8324-8b88a70f659f-kube-api-access-5bsd2\") pod \"community-operators-knh79\" (UID: \"b0ef6389-987a-492e-8324-8b88a70f659f\") " pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.583740 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.645696 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7m8mk" event={"ID":"99b03743-48fc-4006-8cfd-b912deba0232","Type":"ContainerDied","Data":"69d13a742739e189c4cc86873988b5be3e5ab7365128281169ea2fa991bac80e"} Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.645986 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69d13a742739e189c4cc86873988b5be3e5ab7365128281169ea2fa991bac80e" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.646003 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7m8mk" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.916142 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-g8kzg"] Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.918101 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.922306 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-g8kzg"] Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.929936 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-k4946"] Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.930926 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.935021 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.935263 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.935415 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lfvb4" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.935552 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.935772 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.943721 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-scripts\") pod 
\"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.943765 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-credential-keys\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.943788 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-dns-svc\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.943829 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4qzl\" (UniqueName: \"kubernetes.io/projected/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-kube-api-access-b4qzl\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.943871 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-config-data\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.943888 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.943905 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-fernet-keys\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.943926 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-config\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.943946 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-combined-ca-bundle\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.943977 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.944001 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnltz\" 
(UniqueName: \"kubernetes.io/projected/f087f525-4007-4f12-b4e0-89e5d6b4eafb-kube-api-access-nnltz\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.944023 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:29 crc kubenswrapper[4717]: I0221 22:02:29.956312 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k4946"] Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.047205 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnltz\" (UniqueName: \"kubernetes.io/projected/f087f525-4007-4f12-b4e0-89e5d6b4eafb-kube-api-access-nnltz\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.047512 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.047562 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-scripts\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:30 crc 
kubenswrapper[4717]: I0221 22:02:30.047600 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-credential-keys\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.047637 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-dns-svc\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.047678 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4qzl\" (UniqueName: \"kubernetes.io/projected/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-kube-api-access-b4qzl\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.047747 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-config-data\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.047763 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.047793 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-fernet-keys\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.047831 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-config\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.047849 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-combined-ca-bundle\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.048468 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.050684 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.051508 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-config\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.052132 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-dns-svc\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.053234 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.053821 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.057053 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-knh79"] Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.068808 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-combined-ca-bundle\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.078779 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-b4qzl\" (UniqueName: \"kubernetes.io/projected/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-kube-api-access-b4qzl\") pod \"dnsmasq-dns-55fff446b9-g8kzg\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.084095 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-config-data\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.090715 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-fernet-keys\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.096479 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnltz\" (UniqueName: \"kubernetes.io/projected/f087f525-4007-4f12-b4e0-89e5d6b4eafb-kube-api-access-nnltz\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.099200 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-scripts\") pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.102785 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-credential-keys\") 
pod \"keystone-bootstrap-k4946\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") " pod="openstack/keystone-bootstrap-k4946" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.130164 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5b5d497dbf-bkvz6"] Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.131465 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b5d497dbf-bkvz6" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.140555 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.140725 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.140836 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.140953 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-gbcmb" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.149559 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/09f86cea-494d-4f11-b9f5-2045f7aabd92-horizon-secret-key\") pod \"horizon-5b5d497dbf-bkvz6\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") " pod="openstack/horizon-5b5d497dbf-bkvz6" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.149636 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09f86cea-494d-4f11-b9f5-2045f7aabd92-scripts\") pod \"horizon-5b5d497dbf-bkvz6\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") " pod="openstack/horizon-5b5d497dbf-bkvz6" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.149690 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09f86cea-494d-4f11-b9f5-2045f7aabd92-config-data\") pod \"horizon-5b5d497dbf-bkvz6\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") " pod="openstack/horizon-5b5d497dbf-bkvz6" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.149717 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09f86cea-494d-4f11-b9f5-2045f7aabd92-logs\") pod \"horizon-5b5d497dbf-bkvz6\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") " pod="openstack/horizon-5b5d497dbf-bkvz6" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.149735 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pchpc\" (UniqueName: \"kubernetes.io/projected/09f86cea-494d-4f11-b9f5-2045f7aabd92-kube-api-access-pchpc\") pod \"horizon-5b5d497dbf-bkvz6\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") " pod="openstack/horizon-5b5d497dbf-bkvz6" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.164136 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-v94db"] Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.171554 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-v94db" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.184359 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.184545 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.184642 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-x55b2" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.185099 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b5d497dbf-bkvz6"] Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.217747 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-v94db"] Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.230958 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-6xk4j"] Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.232009 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6xk4j" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.244885 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-q22pl" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.245069 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.245267 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.251815 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09f86cea-494d-4f11-b9f5-2045f7aabd92-scripts\") pod \"horizon-5b5d497dbf-bkvz6\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") " pod="openstack/horizon-5b5d497dbf-bkvz6" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.251876 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-combined-ca-bundle\") pod \"neutron-db-sync-v94db\" (UID: \"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7\") " pod="openstack/neutron-db-sync-v94db" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.251904 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-combined-ca-bundle\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.251931 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-db-sync-config-data\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.251952 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09f86cea-494d-4f11-b9f5-2045f7aabd92-config-data\") pod \"horizon-5b5d497dbf-bkvz6\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") " pod="openstack/horizon-5b5d497dbf-bkvz6" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.251976 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-scripts\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.251997 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09f86cea-494d-4f11-b9f5-2045f7aabd92-logs\") pod \"horizon-5b5d497dbf-bkvz6\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") " pod="openstack/horizon-5b5d497dbf-bkvz6" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.252015 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pchpc\" (UniqueName: \"kubernetes.io/projected/09f86cea-494d-4f11-b9f5-2045f7aabd92-kube-api-access-pchpc\") pod \"horizon-5b5d497dbf-bkvz6\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") " pod="openstack/horizon-5b5d497dbf-bkvz6" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.252039 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-config-data\") pod 
\"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.252067 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-config\") pod \"neutron-db-sync-v94db\" (UID: \"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7\") " pod="openstack/neutron-db-sync-v94db"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.252087 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mfwg\" (UniqueName: \"kubernetes.io/projected/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-kube-api-access-2mfwg\") pod \"neutron-db-sync-v94db\" (UID: \"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7\") " pod="openstack/neutron-db-sync-v94db"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.253118 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3727ff36-57dd-4c91-ab08-d5c87ee4e357-etc-machine-id\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.253245 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7jlg\" (UniqueName: \"kubernetes.io/projected/3727ff36-57dd-4c91-ab08-d5c87ee4e357-kube-api-access-f7jlg\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.253328 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/09f86cea-494d-4f11-b9f5-2045f7aabd92-horizon-secret-key\") pod \"horizon-5b5d497dbf-bkvz6\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") " pod="openstack/horizon-5b5d497dbf-bkvz6"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.253724 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09f86cea-494d-4f11-b9f5-2045f7aabd92-config-data\") pod \"horizon-5b5d497dbf-bkvz6\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") " pod="openstack/horizon-5b5d497dbf-bkvz6"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.254142 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09f86cea-494d-4f11-b9f5-2045f7aabd92-scripts\") pod \"horizon-5b5d497dbf-bkvz6\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") " pod="openstack/horizon-5b5d497dbf-bkvz6"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.254370 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09f86cea-494d-4f11-b9f5-2045f7aabd92-logs\") pod \"horizon-5b5d497dbf-bkvz6\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") " pod="openstack/horizon-5b5d497dbf-bkvz6"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.254628 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-g8kzg"]
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.255158 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-g8kzg"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.276225 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zsxs4"]
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.277224 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.288037 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6476r"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.288220 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.288580 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/09f86cea-494d-4f11-b9f5-2045f7aabd92-horizon-secret-key\") pod \"horizon-5b5d497dbf-bkvz6\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") " pod="openstack/horizon-5b5d497dbf-bkvz6"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.288633 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.293570 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6xk4j"]
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.338209 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zsxs4"]
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.344653 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k4946"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.347465 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pchpc\" (UniqueName: \"kubernetes.io/projected/09f86cea-494d-4f11-b9f5-2045f7aabd92-kube-api-access-pchpc\") pod \"horizon-5b5d497dbf-bkvz6\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") " pod="openstack/horizon-5b5d497dbf-bkvz6"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.370440 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3727ff36-57dd-4c91-ab08-d5c87ee4e357-etc-machine-id\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.370534 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-combined-ca-bundle\") pod \"placement-db-sync-zsxs4\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") " pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.370586 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7jlg\" (UniqueName: \"kubernetes.io/projected/3727ff36-57dd-4c91-ab08-d5c87ee4e357-kube-api-access-f7jlg\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.370615 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b5d497dbf-bkvz6"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.370680 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3727ff36-57dd-4c91-ab08-d5c87ee4e357-etc-machine-id\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.370624 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p2tz\" (UniqueName: \"kubernetes.io/projected/d38c89d0-4315-4d98-86bc-570662736bba-kube-api-access-6p2tz\") pod \"placement-db-sync-zsxs4\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") " pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.370998 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-combined-ca-bundle\") pod \"neutron-db-sync-v94db\" (UID: \"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7\") " pod="openstack/neutron-db-sync-v94db"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.371033 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-combined-ca-bundle\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.371086 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-scripts\") pod \"placement-db-sync-zsxs4\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") " pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.371111 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-db-sync-config-data\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.371146 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d38c89d0-4315-4d98-86bc-570662736bba-logs\") pod \"placement-db-sync-zsxs4\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") " pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.371169 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-scripts\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.371212 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-config-data\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.371258 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-config\") pod \"neutron-db-sync-v94db\" (UID: \"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7\") " pod="openstack/neutron-db-sync-v94db"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.371278 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-config-data\") pod \"placement-db-sync-zsxs4\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") " pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.371305 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mfwg\" (UniqueName: \"kubernetes.io/projected/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-kube-api-access-2mfwg\") pod \"neutron-db-sync-v94db\" (UID: \"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7\") " pod="openstack/neutron-db-sync-v94db"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.397962 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-config-data\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.400831 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-config\") pod \"neutron-db-sync-v94db\" (UID: \"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7\") " pod="openstack/neutron-db-sync-v94db"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.401184 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-db-sync-config-data\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.403948 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-scripts\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.413024 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mp7ws"]
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.413560 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-combined-ca-bundle\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.424937 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-combined-ca-bundle\") pod \"neutron-db-sync-v94db\" (UID: \"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7\") " pod="openstack/neutron-db-sync-v94db"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.426830 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.427567 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mfwg\" (UniqueName: \"kubernetes.io/projected/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-kube-api-access-2mfwg\") pod \"neutron-db-sync-v94db\" (UID: \"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7\") " pod="openstack/neutron-db-sync-v94db"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.428157 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7jlg\" (UniqueName: \"kubernetes.io/projected/3727ff36-57dd-4c91-ab08-d5c87ee4e357-kube-api-access-f7jlg\") pod \"cinder-db-sync-6xk4j\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.431534 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6xk4j"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.475844 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d38c89d0-4315-4d98-86bc-570662736bba-logs\") pod \"placement-db-sync-zsxs4\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") " pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.475954 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-config-data\") pod \"placement-db-sync-zsxs4\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") " pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.475998 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-combined-ca-bundle\") pod \"placement-db-sync-zsxs4\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") " pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.476039 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p2tz\" (UniqueName: \"kubernetes.io/projected/d38c89d0-4315-4d98-86bc-570662736bba-kube-api-access-6p2tz\") pod \"placement-db-sync-zsxs4\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") " pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.476114 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-scripts\") pod \"placement-db-sync-zsxs4\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") " pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.481939 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d38c89d0-4315-4d98-86bc-570662736bba-logs\") pod \"placement-db-sync-zsxs4\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") " pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.489191 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-combined-ca-bundle\") pod \"placement-db-sync-zsxs4\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") " pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.493503 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-scripts\") pod \"placement-db-sync-zsxs4\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") " pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.508379 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p2tz\" (UniqueName: \"kubernetes.io/projected/d38c89d0-4315-4d98-86bc-570662736bba-kube-api-access-6p2tz\") pod \"placement-db-sync-zsxs4\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") " pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.508846 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-config-data\") pod \"placement-db-sync-zsxs4\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") " pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.552031 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mp7ws"]
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.577640 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.577697 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jql8m\" (UniqueName: \"kubernetes.io/projected/deeb3ad3-4fe8-4faf-9307-5da9988002f6-kube-api-access-jql8m\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.577756 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.577777 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.577804 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.577845 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-config\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.580459 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-2h82m"]
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.581509 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2h82m"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.585787 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-mpvs9"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.586153 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.593984 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7ff8b8ffdf-4dwxk"]
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.597200 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7ff8b8ffdf-4dwxk"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.632926 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.634848 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.638151 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.638414 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.650008 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-2h82m"]
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.658386 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.668755 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7ff8b8ffdf-4dwxk"]
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679429 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq4cc\" (UniqueName: \"kubernetes.io/projected/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-kube-api-access-kq4cc\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679478 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/439eab0e-0489-4a97-993e-c6c3df03e694-config-data\") pod \"horizon-7ff8b8ffdf-4dwxk\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") " pod="openstack/horizon-7ff8b8ffdf-4dwxk"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679505 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-config-data\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679537 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679556 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdhmn\" (UniqueName: \"kubernetes.io/projected/439eab0e-0489-4a97-993e-c6c3df03e694-kube-api-access-vdhmn\") pod \"horizon-7ff8b8ffdf-4dwxk\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") " pod="openstack/horizon-7ff8b8ffdf-4dwxk"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679584 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jql8m\" (UniqueName: \"kubernetes.io/projected/deeb3ad3-4fe8-4faf-9307-5da9988002f6-kube-api-access-jql8m\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679602 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/439eab0e-0489-4a97-993e-c6c3df03e694-logs\") pod \"horizon-7ff8b8ffdf-4dwxk\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") " pod="openstack/horizon-7ff8b8ffdf-4dwxk"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679623 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/439eab0e-0489-4a97-993e-c6c3df03e694-scripts\") pod \"horizon-7ff8b8ffdf-4dwxk\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") " pod="openstack/horizon-7ff8b8ffdf-4dwxk"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679642 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679666 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-run-httpd\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679691 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679715 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679730 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/439eab0e-0489-4a97-993e-c6c3df03e694-horizon-secret-key\") pod \"horizon-7ff8b8ffdf-4dwxk\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") " pod="openstack/horizon-7ff8b8ffdf-4dwxk"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679747 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2n6z\" (UniqueName: \"kubernetes.io/projected/a945001c-fdf1-4bda-8012-3df96d9781ce-kube-api-access-s2n6z\") pod \"barbican-db-sync-2h82m\" (UID: \"a945001c-fdf1-4bda-8012-3df96d9781ce\") " pod="openstack/barbican-db-sync-2h82m"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679765 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679788 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679805 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-scripts\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679825 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-log-httpd\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679874 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-config\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679898 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a945001c-fdf1-4bda-8012-3df96d9781ce-db-sync-config-data\") pod \"barbican-db-sync-2h82m\" (UID: \"a945001c-fdf1-4bda-8012-3df96d9781ce\") " pod="openstack/barbican-db-sync-2h82m"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.679921 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a945001c-fdf1-4bda-8012-3df96d9781ce-combined-ca-bundle\") pod \"barbican-db-sync-2h82m\" (UID: \"a945001c-fdf1-4bda-8012-3df96d9781ce\") " pod="openstack/barbican-db-sync-2h82m"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.681011 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.681519 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.681556 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-config\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.682195 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.682285 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.701345 4717 generic.go:334] "Generic (PLEG): container finished" podID="b0ef6389-987a-492e-8324-8b88a70f659f" containerID="7df5fcbeecc7602ad0212f057ba7b16afc2cfac946b7127d9d82fd27bafadfdd" exitCode=0
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.701390 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knh79" event={"ID":"b0ef6389-987a-492e-8324-8b88a70f659f","Type":"ContainerDied","Data":"7df5fcbeecc7602ad0212f057ba7b16afc2cfac946b7127d9d82fd27bafadfdd"}
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.701420 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knh79" event={"ID":"b0ef6389-987a-492e-8324-8b88a70f659f","Type":"ContainerStarted","Data":"2f976312bb0556b7c6a13ca90f3e55f9b57869a915bda10828b9e280adaee9bf"}
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.701813 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jql8m\" (UniqueName: \"kubernetes.io/projected/deeb3ad3-4fe8-4faf-9307-5da9988002f6-kube-api-access-jql8m\") pod \"dnsmasq-dns-76fcf4b695-mp7ws\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.706152 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-v94db"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.759085 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.781255 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/439eab0e-0489-4a97-993e-c6c3df03e694-horizon-secret-key\") pod \"horizon-7ff8b8ffdf-4dwxk\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") " pod="openstack/horizon-7ff8b8ffdf-4dwxk"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.781291 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2n6z\" (UniqueName: \"kubernetes.io/projected/a945001c-fdf1-4bda-8012-3df96d9781ce-kube-api-access-s2n6z\") pod \"barbican-db-sync-2h82m\" (UID: \"a945001c-fdf1-4bda-8012-3df96d9781ce\") " pod="openstack/barbican-db-sync-2h82m"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.781317 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.781346 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-scripts\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.781369 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-log-httpd\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.781410 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a945001c-fdf1-4bda-8012-3df96d9781ce-db-sync-config-data\") pod \"barbican-db-sync-2h82m\" (UID: \"a945001c-fdf1-4bda-8012-3df96d9781ce\") " pod="openstack/barbican-db-sync-2h82m"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.781431 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a945001c-fdf1-4bda-8012-3df96d9781ce-combined-ca-bundle\") pod \"barbican-db-sync-2h82m\" (UID: \"a945001c-fdf1-4bda-8012-3df96d9781ce\") " pod="openstack/barbican-db-sync-2h82m"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.781449 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq4cc\" (UniqueName: \"kubernetes.io/projected/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-kube-api-access-kq4cc\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.781476 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/439eab0e-0489-4a97-993e-c6c3df03e694-config-data\") pod \"horizon-7ff8b8ffdf-4dwxk\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") " pod="openstack/horizon-7ff8b8ffdf-4dwxk"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.781499 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-config-data\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.781530 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdhmn\" (UniqueName: \"kubernetes.io/projected/439eab0e-0489-4a97-993e-c6c3df03e694-kube-api-access-vdhmn\") pod \"horizon-7ff8b8ffdf-4dwxk\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") " pod="openstack/horizon-7ff8b8ffdf-4dwxk"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.781556 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/439eab0e-0489-4a97-993e-c6c3df03e694-logs\") pod \"horizon-7ff8b8ffdf-4dwxk\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") " pod="openstack/horizon-7ff8b8ffdf-4dwxk"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.781577 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/439eab0e-0489-4a97-993e-c6c3df03e694-scripts\") pod \"horizon-7ff8b8ffdf-4dwxk\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") " pod="openstack/horizon-7ff8b8ffdf-4dwxk"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.781591 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0"
Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.781613 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName:
\"kubernetes.io/empty-dir/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-run-httpd\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.785648 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a945001c-fdf1-4bda-8012-3df96d9781ce-combined-ca-bundle\") pod \"barbican-db-sync-2h82m\" (UID: \"a945001c-fdf1-4bda-8012-3df96d9781ce\") " pod="openstack/barbican-db-sync-2h82m" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.787028 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/439eab0e-0489-4a97-993e-c6c3df03e694-config-data\") pod \"horizon-7ff8b8ffdf-4dwxk\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") " pod="openstack/horizon-7ff8b8ffdf-4dwxk" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.790755 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/439eab0e-0489-4a97-993e-c6c3df03e694-logs\") pod \"horizon-7ff8b8ffdf-4dwxk\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") " pod="openstack/horizon-7ff8b8ffdf-4dwxk" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.794157 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-run-httpd\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.798446 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/439eab0e-0489-4a97-993e-c6c3df03e694-horizon-secret-key\") pod \"horizon-7ff8b8ffdf-4dwxk\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") " pod="openstack/horizon-7ff8b8ffdf-4dwxk" Feb 21 
22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.800126 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-log-httpd\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.803136 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/439eab0e-0489-4a97-993e-c6c3df03e694-scripts\") pod \"horizon-7ff8b8ffdf-4dwxk\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") " pod="openstack/horizon-7ff8b8ffdf-4dwxk" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.803209 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.804562 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.808427 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-config-data\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.809671 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a945001c-fdf1-4bda-8012-3df96d9781ce-db-sync-config-data\") pod 
\"barbican-db-sync-2h82m\" (UID: \"a945001c-fdf1-4bda-8012-3df96d9781ce\") " pod="openstack/barbican-db-sync-2h82m" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.812839 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq4cc\" (UniqueName: \"kubernetes.io/projected/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-kube-api-access-kq4cc\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.816917 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdhmn\" (UniqueName: \"kubernetes.io/projected/439eab0e-0489-4a97-993e-c6c3df03e694-kube-api-access-vdhmn\") pod \"horizon-7ff8b8ffdf-4dwxk\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") " pod="openstack/horizon-7ff8b8ffdf-4dwxk" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.817387 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2n6z\" (UniqueName: \"kubernetes.io/projected/a945001c-fdf1-4bda-8012-3df96d9781ce-kube-api-access-s2n6z\") pod \"barbican-db-sync-2h82m\" (UID: \"a945001c-fdf1-4bda-8012-3df96d9781ce\") " pod="openstack/barbican-db-sync-2h82m" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.818175 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-scripts\") pod \"ceilometer-0\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " pod="openstack/ceilometer-0" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.899480 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-g8kzg"] Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.908994 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.944834 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2h82m" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.975484 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7ff8b8ffdf-4dwxk" Feb 21 22:02:30 crc kubenswrapper[4717]: I0221 22:02:30.988995 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.116434 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6xk4j"] Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.121914 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b5d497dbf-bkvz6"] Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.164769 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k4946"] Feb 21 22:02:31 crc kubenswrapper[4717]: W0221 22:02:31.170311 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09f86cea_494d_4f11_b9f5_2045f7aabd92.slice/crio-5398f26e45c766dbab1cd4b70b92b07420e8879318c1c410c2e94aeffc461af3 WatchSource:0}: Error finding container 5398f26e45c766dbab1cd4b70b92b07420e8879318c1c410c2e94aeffc461af3: Status 404 returned error can't find the container with id 5398f26e45c766dbab1cd4b70b92b07420e8879318c1c410c2e94aeffc461af3 Feb 21 22:02:31 crc kubenswrapper[4717]: W0221 22:02:31.181561 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3727ff36_57dd_4c91_ab08_d5c87ee4e357.slice/crio-f4effad02a5e29ffefe59a3b8e1ac69d5ebe492203fa2ec0b1e88da0db58a205 WatchSource:0}: Error finding container 
f4effad02a5e29ffefe59a3b8e1ac69d5ebe492203fa2ec0b1e88da0db58a205: Status 404 returned error can't find the container with id f4effad02a5e29ffefe59a3b8e1ac69d5ebe492203fa2ec0b1e88da0db58a205 Feb 21 22:02:31 crc kubenswrapper[4717]: W0221 22:02:31.185744 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf087f525_4007_4f12_b4e0_89e5d6b4eafb.slice/crio-6acd987bf9cbda174bb61d50245721be3767fa94f3ab14bba6d6394c6650739e WatchSource:0}: Error finding container 6acd987bf9cbda174bb61d50245721be3767fa94f3ab14bba6d6394c6650739e: Status 404 returned error can't find the container with id 6acd987bf9cbda174bb61d50245721be3767fa94f3ab14bba6d6394c6650739e Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.453473 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zsxs4"] Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.492755 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-v94db"] Feb 21 22:02:31 crc kubenswrapper[4717]: W0221 22:02:31.493853 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd38c89d0_4315_4d98_86bc_570662736bba.slice/crio-977a4e01fcc164859fc4446b4123b574c70a1efb170e5a3ad793cf4a6f1391e4 WatchSource:0}: Error finding container 977a4e01fcc164859fc4446b4123b574c70a1efb170e5a3ad793cf4a6f1391e4: Status 404 returned error can't find the container with id 977a4e01fcc164859fc4446b4123b574c70a1efb170e5a3ad793cf4a6f1391e4 Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.750063 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v94db" event={"ID":"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7","Type":"ContainerStarted","Data":"76274386de511323bd2b8b330ff66a076bef4cf47c44dabdacc3ef3b89bee5fb"} Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.773504 4717 generic.go:334] "Generic (PLEG): 
container finished" podID="4a1ef093-6ae0-4b7d-b692-f70ac2eab521" containerID="464b865681ddec4bbd57543b49c16425fb51dc764a580da852a5ca24e51fbd64" exitCode=0 Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.773809 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" event={"ID":"4a1ef093-6ae0-4b7d-b692-f70ac2eab521","Type":"ContainerDied","Data":"464b865681ddec4bbd57543b49c16425fb51dc764a580da852a5ca24e51fbd64"} Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.773875 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" event={"ID":"4a1ef093-6ae0-4b7d-b692-f70ac2eab521","Type":"ContainerStarted","Data":"955e5209269a4f7284518b27760ecc0dcc59c380cd6726d8cfd68666804ce5cd"} Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.777805 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mp7ws"] Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.789737 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zsxs4" event={"ID":"d38c89d0-4315-4d98-86bc-570662736bba","Type":"ContainerStarted","Data":"977a4e01fcc164859fc4446b4123b574c70a1efb170e5a3ad793cf4a6f1391e4"} Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.819923 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k4946" event={"ID":"f087f525-4007-4f12-b4e0-89e5d6b4eafb","Type":"ContainerStarted","Data":"fc5d9bc25940a3a0e67e4bd13fbf937a13b921f9b8084b93ebc9d58c63b41247"} Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.820104 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k4946" event={"ID":"f087f525-4007-4f12-b4e0-89e5d6b4eafb","Type":"ContainerStarted","Data":"6acd987bf9cbda174bb61d50245721be3767fa94f3ab14bba6d6394c6650739e"} Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.824885 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-db-sync-2h82m"] Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.835502 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b5d497dbf-bkvz6" event={"ID":"09f86cea-494d-4f11-b9f5-2045f7aabd92","Type":"ContainerStarted","Data":"5398f26e45c766dbab1cd4b70b92b07420e8879318c1c410c2e94aeffc461af3"} Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.839007 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6xk4j" event={"ID":"3727ff36-57dd-4c91-ab08-d5c87ee4e357","Type":"ContainerStarted","Data":"f4effad02a5e29ffefe59a3b8e1ac69d5ebe492203fa2ec0b1e88da0db58a205"} Feb 21 22:02:31 crc kubenswrapper[4717]: E0221 22:02:31.846872 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a1ef093_6ae0_4b7d_b692_f70ac2eab521.slice/crio-conmon-464b865681ddec4bbd57543b49c16425fb51dc764a580da852a5ca24e51fbd64.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a1ef093_6ae0_4b7d_b692_f70ac2eab521.slice/crio-464b865681ddec4bbd57543b49c16425fb51dc764a580da852a5ca24e51fbd64.scope\": RecentStats: unable to find data in memory cache]" Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.899689 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-k4946" podStartSLOduration=2.8996715589999997 podStartE2EDuration="2.899671559s" podCreationTimestamp="2026-02-21 22:02:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:02:31.85045899 +0000 UTC m=+966.631992612" watchObservedRunningTime="2026-02-21 22:02:31.899671559 +0000 UTC m=+966.681205181" Feb 21 22:02:31 crc kubenswrapper[4717]: I0221 22:02:31.939404 4717 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ceilometer-0"] Feb 21 22:02:31 crc kubenswrapper[4717]: W0221 22:02:31.945890 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ff6e9c3_c94b_43ec_bba0_6e180be99f9e.slice/crio-310da457da96fdb6e65cb1262e97f3ee1bbc39cc77fb5e92d5eca3e4d2723b81 WatchSource:0}: Error finding container 310da457da96fdb6e65cb1262e97f3ee1bbc39cc77fb5e92d5eca3e4d2723b81: Status 404 returned error can't find the container with id 310da457da96fdb6e65cb1262e97f3ee1bbc39cc77fb5e92d5eca3e4d2723b81 Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.082813 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7ff8b8ffdf-4dwxk"] Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.320277 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.419484 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-dns-svc\") pod \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.419577 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4qzl\" (UniqueName: \"kubernetes.io/projected/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-kube-api-access-b4qzl\") pod \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.419642 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-ovsdbserver-nb\") pod \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " 
Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.419666 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-ovsdbserver-sb\") pod \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.419698 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-config\") pod \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.419741 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-dns-swift-storage-0\") pod \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\" (UID: \"4a1ef093-6ae0-4b7d-b692-f70ac2eab521\") " Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.429203 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-kube-api-access-b4qzl" (OuterVolumeSpecName: "kube-api-access-b4qzl") pod "4a1ef093-6ae0-4b7d-b692-f70ac2eab521" (UID: "4a1ef093-6ae0-4b7d-b692-f70ac2eab521"). InnerVolumeSpecName "kube-api-access-b4qzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.448849 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a1ef093-6ae0-4b7d-b692-f70ac2eab521" (UID: "4a1ef093-6ae0-4b7d-b692-f70ac2eab521"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.450583 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4a1ef093-6ae0-4b7d-b692-f70ac2eab521" (UID: "4a1ef093-6ae0-4b7d-b692-f70ac2eab521"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.460425 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4a1ef093-6ae0-4b7d-b692-f70ac2eab521" (UID: "4a1ef093-6ae0-4b7d-b692-f70ac2eab521"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.462226 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4a1ef093-6ae0-4b7d-b692-f70ac2eab521" (UID: "4a1ef093-6ae0-4b7d-b692-f70ac2eab521"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.467678 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-config" (OuterVolumeSpecName: "config") pod "4a1ef093-6ae0-4b7d-b692-f70ac2eab521" (UID: "4a1ef093-6ae0-4b7d-b692-f70ac2eab521"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.524095 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.524126 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.524136 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.524144 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.524153 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4qzl\" (UniqueName: \"kubernetes.io/projected/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-kube-api-access-b4qzl\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.524163 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a1ef093-6ae0-4b7d-b692-f70ac2eab521-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.849790 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ff8b8ffdf-4dwxk" event={"ID":"439eab0e-0489-4a97-993e-c6c3df03e694","Type":"ContainerStarted","Data":"87885ae6b97823f2265aeba9f16a62d9488d237182d9d1b42f2afe2d43f98260"} Feb 21 22:02:32 crc 
kubenswrapper[4717]: I0221 22:02:32.852455 4717 generic.go:334] "Generic (PLEG): container finished" podID="deeb3ad3-4fe8-4faf-9307-5da9988002f6" containerID="65e9cfd12ba025bf858dc4e8fb25c4b474b986d65007e857520136d028b3244c" exitCode=0 Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.852550 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws" event={"ID":"deeb3ad3-4fe8-4faf-9307-5da9988002f6","Type":"ContainerDied","Data":"65e9cfd12ba025bf858dc4e8fb25c4b474b986d65007e857520136d028b3244c"} Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.852589 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws" event={"ID":"deeb3ad3-4fe8-4faf-9307-5da9988002f6","Type":"ContainerStarted","Data":"4165cf23f8e72f6893a95b8590e85b0cde2cac25c2304d1d7e89f9b04e628e85"} Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.856080 4717 generic.go:334] "Generic (PLEG): container finished" podID="b0ef6389-987a-492e-8324-8b88a70f659f" containerID="843981a6151052a7bdd420116752c5f52aa9283a4ab543709c17e30d9c56bb80" exitCode=0 Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.856471 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knh79" event={"ID":"b0ef6389-987a-492e-8324-8b88a70f659f","Type":"ContainerDied","Data":"843981a6151052a7bdd420116752c5f52aa9283a4ab543709c17e30d9c56bb80"} Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.861105 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v94db" event={"ID":"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7","Type":"ContainerStarted","Data":"1712bb4801ea35250b75861058b9239da153385e4b26f1deed32e14c620ce1cc"} Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.871198 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e","Type":"ContainerStarted","Data":"310da457da96fdb6e65cb1262e97f3ee1bbc39cc77fb5e92d5eca3e4d2723b81"} Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.875726 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" event={"ID":"4a1ef093-6ae0-4b7d-b692-f70ac2eab521","Type":"ContainerDied","Data":"955e5209269a4f7284518b27760ecc0dcc59c380cd6726d8cfd68666804ce5cd"} Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.875794 4717 scope.go:117] "RemoveContainer" containerID="464b865681ddec4bbd57543b49c16425fb51dc764a580da852a5ca24e51fbd64" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.875925 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-g8kzg" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.879933 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2h82m" event={"ID":"a945001c-fdf1-4bda-8012-3df96d9781ce","Type":"ContainerStarted","Data":"49750635f2e50d64d2b499f091f7be35db00c3f12d0617cddc3484c1144cd48c"} Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.927578 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-v94db" podStartSLOduration=2.927531291 podStartE2EDuration="2.927531291s" podCreationTimestamp="2026-02-21 22:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:02:32.911760173 +0000 UTC m=+967.693293785" watchObservedRunningTime="2026-02-21 22:02:32.927531291 +0000 UTC m=+967.709064923" Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.995131 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-g8kzg"] Feb 21 22:02:32 crc kubenswrapper[4717]: I0221 22:02:32.995171 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-55fff446b9-g8kzg"] Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.254155 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b5d497dbf-bkvz6"] Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.266978 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.281450 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5c6f587885-vvp7f"] Feb 21 22:02:33 crc kubenswrapper[4717]: E0221 22:02:33.282024 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a1ef093-6ae0-4b7d-b692-f70ac2eab521" containerName="init" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.282040 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a1ef093-6ae0-4b7d-b692-f70ac2eab521" containerName="init" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.282183 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a1ef093-6ae0-4b7d-b692-f70ac2eab521" containerName="init" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.284700 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.297743 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c6f587885-vvp7f"] Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.351873 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea65478f-2244-4670-9821-526d00eb1b9a-scripts\") pod \"horizon-5c6f587885-vvp7f\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") " pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.351969 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqm7c\" (UniqueName: \"kubernetes.io/projected/ea65478f-2244-4670-9821-526d00eb1b9a-kube-api-access-fqm7c\") pod \"horizon-5c6f587885-vvp7f\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") " pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.352021 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea65478f-2244-4670-9821-526d00eb1b9a-logs\") pod \"horizon-5c6f587885-vvp7f\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") " pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.352041 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea65478f-2244-4670-9821-526d00eb1b9a-horizon-secret-key\") pod \"horizon-5c6f587885-vvp7f\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") " pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.352113 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/ea65478f-2244-4670-9821-526d00eb1b9a-config-data\") pod \"horizon-5c6f587885-vvp7f\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") " pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.453694 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea65478f-2244-4670-9821-526d00eb1b9a-horizon-secret-key\") pod \"horizon-5c6f587885-vvp7f\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") " pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.453762 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea65478f-2244-4670-9821-526d00eb1b9a-config-data\") pod \"horizon-5c6f587885-vvp7f\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") " pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.453816 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea65478f-2244-4670-9821-526d00eb1b9a-scripts\") pod \"horizon-5c6f587885-vvp7f\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") " pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.453899 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqm7c\" (UniqueName: \"kubernetes.io/projected/ea65478f-2244-4670-9821-526d00eb1b9a-kube-api-access-fqm7c\") pod \"horizon-5c6f587885-vvp7f\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") " pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.453927 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea65478f-2244-4670-9821-526d00eb1b9a-logs\") pod 
\"horizon-5c6f587885-vvp7f\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") " pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.454502 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea65478f-2244-4670-9821-526d00eb1b9a-logs\") pod \"horizon-5c6f587885-vvp7f\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") " pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.454999 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea65478f-2244-4670-9821-526d00eb1b9a-scripts\") pod \"horizon-5c6f587885-vvp7f\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") " pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.455312 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea65478f-2244-4670-9821-526d00eb1b9a-config-data\") pod \"horizon-5c6f587885-vvp7f\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") " pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.477694 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea65478f-2244-4670-9821-526d00eb1b9a-horizon-secret-key\") pod \"horizon-5c6f587885-vvp7f\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") " pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.480902 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqm7c\" (UniqueName: \"kubernetes.io/projected/ea65478f-2244-4670-9821-526d00eb1b9a-kube-api-access-fqm7c\") pod \"horizon-5c6f587885-vvp7f\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") " pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc 
kubenswrapper[4717]: I0221 22:02:33.601019 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c6f587885-vvp7f" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.934841 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws" event={"ID":"deeb3ad3-4fe8-4faf-9307-5da9988002f6","Type":"ContainerStarted","Data":"2c9ed442ca6b3107ad9b7938dd2a39e91b15d48bb914f9df805405565a093e25"} Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.936019 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws" Feb 21 22:02:33 crc kubenswrapper[4717]: I0221 22:02:33.971182 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws" podStartSLOduration=3.971167021 podStartE2EDuration="3.971167021s" podCreationTimestamp="2026-02-21 22:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:02:33.970139216 +0000 UTC m=+968.751672828" watchObservedRunningTime="2026-02-21 22:02:33.971167021 +0000 UTC m=+968.752700643" Feb 21 22:02:34 crc kubenswrapper[4717]: I0221 22:02:34.009969 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a1ef093-6ae0-4b7d-b692-f70ac2eab521" path="/var/lib/kubelet/pods/4a1ef093-6ae0-4b7d-b692-f70ac2eab521/volumes" Feb 21 22:02:34 crc kubenswrapper[4717]: I0221 22:02:34.161604 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c6f587885-vvp7f"] Feb 21 22:02:34 crc kubenswrapper[4717]: W0221 22:02:34.178042 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea65478f_2244_4670_9821_526d00eb1b9a.slice/crio-85ab1d9e268f6662ef9f77d4dfe144b0aa9f409bbff245295e0f507ea52b2837 WatchSource:0}: Error finding container 
85ab1d9e268f6662ef9f77d4dfe144b0aa9f409bbff245295e0f507ea52b2837: Status 404 returned error can't find the container with id 85ab1d9e268f6662ef9f77d4dfe144b0aa9f409bbff245295e0f507ea52b2837 Feb 21 22:02:34 crc kubenswrapper[4717]: I0221 22:02:34.976637 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c6f587885-vvp7f" event={"ID":"ea65478f-2244-4670-9821-526d00eb1b9a","Type":"ContainerStarted","Data":"85ab1d9e268f6662ef9f77d4dfe144b0aa9f409bbff245295e0f507ea52b2837"} Feb 21 22:02:35 crc kubenswrapper[4717]: I0221 22:02:35.992814 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4ttdv" event={"ID":"7995c515-84a0-44d3-82e8-99a2ab1fb7b2","Type":"ContainerStarted","Data":"138d13f715b4f6a86656eb38c67929a0c8ee1d111fc73febb47ebc73e54203fa"} Feb 21 22:02:35 crc kubenswrapper[4717]: I0221 22:02:35.999793 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knh79" event={"ID":"b0ef6389-987a-492e-8324-8b88a70f659f","Type":"ContainerStarted","Data":"73487b4e55bb61b1c290d4b6d02950062d260fc32b463da204bf8cbe697e956f"} Feb 21 22:02:36 crc kubenswrapper[4717]: I0221 22:02:36.143453 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-knh79" podStartSLOduration=3.517591856 podStartE2EDuration="7.143435106s" podCreationTimestamp="2026-02-21 22:02:29 +0000 UTC" firstStartedPulling="2026-02-21 22:02:30.703600337 +0000 UTC m=+965.485133959" lastFinishedPulling="2026-02-21 22:02:34.329443587 +0000 UTC m=+969.110977209" observedRunningTime="2026-02-21 22:02:36.137611477 +0000 UTC m=+970.919145099" watchObservedRunningTime="2026-02-21 22:02:36.143435106 +0000 UTC m=+970.924968728" Feb 21 22:02:36 crc kubenswrapper[4717]: I0221 22:02:36.156270 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-4ttdv" podStartSLOduration=3.318907797 
podStartE2EDuration="35.156254223s" podCreationTimestamp="2026-02-21 22:02:01 +0000 UTC" firstStartedPulling="2026-02-21 22:02:02.65944089 +0000 UTC m=+937.440974512" lastFinishedPulling="2026-02-21 22:02:34.496787316 +0000 UTC m=+969.278320938" observedRunningTime="2026-02-21 22:02:36.151458889 +0000 UTC m=+970.932992511" watchObservedRunningTime="2026-02-21 22:02:36.156254223 +0000 UTC m=+970.937787845" Feb 21 22:02:37 crc kubenswrapper[4717]: I0221 22:02:37.010401 4717 generic.go:334] "Generic (PLEG): container finished" podID="f087f525-4007-4f12-b4e0-89e5d6b4eafb" containerID="fc5d9bc25940a3a0e67e4bd13fbf937a13b921f9b8084b93ebc9d58c63b41247" exitCode=0 Feb 21 22:02:37 crc kubenswrapper[4717]: I0221 22:02:37.010841 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k4946" event={"ID":"f087f525-4007-4f12-b4e0-89e5d6b4eafb","Type":"ContainerDied","Data":"fc5d9bc25940a3a0e67e4bd13fbf937a13b921f9b8084b93ebc9d58c63b41247"} Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.492592 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7ff8b8ffdf-4dwxk"] Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.516474 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-669df94976-tmfpb"] Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.517783 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.521601 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.542764 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-669df94976-tmfpb"] Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.560526 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-horizon-secret-key\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.560584 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-combined-ca-bundle\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.560848 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-horizon-tls-certs\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.561079 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-scripts\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 
22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.561120 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-config-data\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.561143 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-logs\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.561227 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs9z5\" (UniqueName: \"kubernetes.io/projected/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-kube-api-access-rs9z5\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.599498 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c6f587885-vvp7f"] Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.653667 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-86b8dffbf6-mztpd"] Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.658266 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.662839 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-combined-ca-bundle\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.662914 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-horizon-tls-certs\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.662966 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-scripts\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.662988 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-config-data\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.663008 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-logs\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 
22:02:38.663041 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs9z5\" (UniqueName: \"kubernetes.io/projected/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-kube-api-access-rs9z5\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.663084 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-horizon-secret-key\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.664053 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-logs\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.664401 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-scripts\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.664689 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-config-data\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.666392 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86b8dffbf6-mztpd"] Feb 21 22:02:38 crc 
kubenswrapper[4717]: I0221 22:02:38.671441 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-horizon-secret-key\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.672818 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-combined-ca-bundle\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.693606 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-horizon-tls-certs\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.703933 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs9z5\" (UniqueName: \"kubernetes.io/projected/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-kube-api-access-rs9z5\") pod \"horizon-669df94976-tmfpb\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.764323 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78bwl\" (UniqueName: \"kubernetes.io/projected/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-kube-api-access-78bwl\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.764650 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-horizon-secret-key\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.764676 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-config-data\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.764694 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-logs\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.764727 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-combined-ca-bundle\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.764748 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-scripts\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.764807 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-horizon-tls-certs\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.844883 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.865648 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-horizon-tls-certs\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.865744 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78bwl\" (UniqueName: \"kubernetes.io/projected/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-kube-api-access-78bwl\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.865769 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-horizon-secret-key\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.865790 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-config-data\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " 
pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.865807 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-logs\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.866435 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-logs\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.866487 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-combined-ca-bundle\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.866510 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-scripts\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.867051 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-config-data\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.867155 4717 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-scripts\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.870514 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-horizon-secret-key\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.871333 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-horizon-tls-certs\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.871781 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-combined-ca-bundle\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:38 crc kubenswrapper[4717]: I0221 22:02:38.883058 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78bwl\" (UniqueName: \"kubernetes.io/projected/4e230efb-55a4-4e7f-9d9a-cc61d3123eab-kube-api-access-78bwl\") pod \"horizon-86b8dffbf6-mztpd\" (UID: \"4e230efb-55a4-4e7f-9d9a-cc61d3123eab\") " pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:39 crc kubenswrapper[4717]: I0221 22:02:39.042407 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:02:39 crc kubenswrapper[4717]: I0221 22:02:39.584670 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:39 crc kubenswrapper[4717]: I0221 22:02:39.584712 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:39 crc kubenswrapper[4717]: I0221 22:02:39.643254 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:40 crc kubenswrapper[4717]: I0221 22:02:40.103004 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:40 crc kubenswrapper[4717]: I0221 22:02:40.140250 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-knh79"] Feb 21 22:02:40 crc kubenswrapper[4717]: I0221 22:02:40.912377 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws" Feb 21 22:02:40 crc kubenswrapper[4717]: I0221 22:02:40.992960 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-djqlv"] Feb 21 22:02:40 crc kubenswrapper[4717]: I0221 22:02:40.993378 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" podUID="0351efa1-d14f-468d-ad6c-ea432ef629ba" containerName="dnsmasq-dns" containerID="cri-o://7f2e64ff7f3927feef67353f92620a3d4ae8bbdf80e7870369ab49516feb1b1c" gracePeriod=10 Feb 21 22:02:41 crc kubenswrapper[4717]: I0221 22:02:41.942750 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" podUID="0351efa1-d14f-468d-ad6c-ea432ef629ba" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 
10.217.0.127:5353: connect: connection refused"
Feb 21 22:02:42 crc kubenswrapper[4717]: I0221 22:02:42.080628 4717 generic.go:334] "Generic (PLEG): container finished" podID="0351efa1-d14f-468d-ad6c-ea432ef629ba" containerID="7f2e64ff7f3927feef67353f92620a3d4ae8bbdf80e7870369ab49516feb1b1c" exitCode=0
Feb 21 22:02:42 crc kubenswrapper[4717]: I0221 22:02:42.080835 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-knh79" podUID="b0ef6389-987a-492e-8324-8b88a70f659f" containerName="registry-server" containerID="cri-o://73487b4e55bb61b1c290d4b6d02950062d260fc32b463da204bf8cbe697e956f" gracePeriod=2
Feb 21 22:02:42 crc kubenswrapper[4717]: I0221 22:02:42.080923 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" event={"ID":"0351efa1-d14f-468d-ad6c-ea432ef629ba","Type":"ContainerDied","Data":"7f2e64ff7f3927feef67353f92620a3d4ae8bbdf80e7870369ab49516feb1b1c"}
Feb 21 22:02:43 crc kubenswrapper[4717]: I0221 22:02:43.092417 4717 generic.go:334] "Generic (PLEG): container finished" podID="b0ef6389-987a-492e-8324-8b88a70f659f" containerID="73487b4e55bb61b1c290d4b6d02950062d260fc32b463da204bf8cbe697e956f" exitCode=0
Feb 21 22:02:43 crc kubenswrapper[4717]: I0221 22:02:43.092467 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knh79" event={"ID":"b0ef6389-987a-492e-8324-8b88a70f659f","Type":"ContainerDied","Data":"73487b4e55bb61b1c290d4b6d02950062d260fc32b463da204bf8cbe697e956f"}
Feb 21 22:02:46 crc kubenswrapper[4717]: I0221 22:02:46.942770 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" podUID="0351efa1-d14f-468d-ad6c-ea432ef629ba" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused"
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.132587 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k4946" event={"ID":"f087f525-4007-4f12-b4e0-89e5d6b4eafb","Type":"ContainerDied","Data":"6acd987bf9cbda174bb61d50245721be3767fa94f3ab14bba6d6394c6650739e"}
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.133116 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6acd987bf9cbda174bb61d50245721be3767fa94f3ab14bba6d6394c6650739e"
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.153488 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k4946"
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.219755 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-config-data\") pod \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") "
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.220005 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnltz\" (UniqueName: \"kubernetes.io/projected/f087f525-4007-4f12-b4e0-89e5d6b4eafb-kube-api-access-nnltz\") pod \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") "
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.220111 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-scripts\") pod \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") "
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.220157 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-credential-keys\") pod \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") "
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.220218 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-fernet-keys\") pod \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") "
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.220251 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-combined-ca-bundle\") pod \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\" (UID: \"f087f525-4007-4f12-b4e0-89e5d6b4eafb\") "
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.228475 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f087f525-4007-4f12-b4e0-89e5d6b4eafb" (UID: "f087f525-4007-4f12-b4e0-89e5d6b4eafb"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.228704 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-scripts" (OuterVolumeSpecName: "scripts") pod "f087f525-4007-4f12-b4e0-89e5d6b4eafb" (UID: "f087f525-4007-4f12-b4e0-89e5d6b4eafb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.229083 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f087f525-4007-4f12-b4e0-89e5d6b4eafb" (UID: "f087f525-4007-4f12-b4e0-89e5d6b4eafb"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.231484 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f087f525-4007-4f12-b4e0-89e5d6b4eafb-kube-api-access-nnltz" (OuterVolumeSpecName: "kube-api-access-nnltz") pod "f087f525-4007-4f12-b4e0-89e5d6b4eafb" (UID: "f087f525-4007-4f12-b4e0-89e5d6b4eafb"). InnerVolumeSpecName "kube-api-access-nnltz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.246233 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f087f525-4007-4f12-b4e0-89e5d6b4eafb" (UID: "f087f525-4007-4f12-b4e0-89e5d6b4eafb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.252265 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-config-data" (OuterVolumeSpecName: "config-data") pod "f087f525-4007-4f12-b4e0-89e5d6b4eafb" (UID: "f087f525-4007-4f12-b4e0-89e5d6b4eafb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.321753 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-scripts\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.321830 4717 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.321846 4717 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.321953 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.321966 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f087f525-4007-4f12-b4e0-89e5d6b4eafb-config-data\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:47 crc kubenswrapper[4717]: I0221 22:02:47.322001 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnltz\" (UniqueName: \"kubernetes.io/projected/f087f525-4007-4f12-b4e0-89e5d6b4eafb-kube-api-access-nnltz\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:48 crc kubenswrapper[4717]: E0221 22:02:48.138849 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified"
Feb 21 22:02:48 crc kubenswrapper[4717]: E0221 22:02:48.140373 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n547h5c9h55bh699hffh54ch5d4h65bh59h57ch54dh67fh6ch694h68h669h7bh5bbhbbhfdh59bh576h66dh5f6h56fh544h57h5ddh569h68dh55dh588q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kq4cc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(7ff6e9c3-c94b-43ec-bba0-6e180be99f9e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.150766 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k4946"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.309606 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-k4946"]
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.321503 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-k4946"]
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.410697 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rrpvp"]
Feb 21 22:02:48 crc kubenswrapper[4717]: E0221 22:02:48.411116 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f087f525-4007-4f12-b4e0-89e5d6b4eafb" containerName="keystone-bootstrap"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.411140 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f087f525-4007-4f12-b4e0-89e5d6b4eafb" containerName="keystone-bootstrap"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.411287 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f087f525-4007-4f12-b4e0-89e5d6b4eafb" containerName="keystone-bootstrap"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.411814 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.414483 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.415061 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.415406 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.415525 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lfvb4"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.416235 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.428775 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rrpvp"]
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.452252 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-scripts\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.452305 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-credential-keys\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.452342 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-config-data\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.452494 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-fernet-keys\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.452530 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-combined-ca-bundle\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.452701 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8tm2\" (UniqueName: \"kubernetes.io/projected/a647fe81-8c83-4f0c-996b-1a71081700f0-kube-api-access-j8tm2\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.560276 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8tm2\" (UniqueName: \"kubernetes.io/projected/a647fe81-8c83-4f0c-996b-1a71081700f0-kube-api-access-j8tm2\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.561193 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-scripts\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.561275 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-credential-keys\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.561319 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-config-data\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.561474 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-fernet-keys\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.561516 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-combined-ca-bundle\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.567010 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-credential-keys\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.567176 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-config-data\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.567926 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-combined-ca-bundle\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.571663 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-fernet-keys\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.572197 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-scripts\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.578692 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8tm2\" (UniqueName: \"kubernetes.io/projected/a647fe81-8c83-4f0c-996b-1a71081700f0-kube-api-access-j8tm2\") pod \"keystone-bootstrap-rrpvp\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: I0221 22:02:48.731438 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rrpvp"
Feb 21 22:02:48 crc kubenswrapper[4717]: E0221 22:02:48.842384 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified"
Feb 21 22:02:48 crc kubenswrapper[4717]: E0221 22:02:48.842570 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2n6z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-2h82m_openstack(a945001c-fdf1-4bda-8012-3df96d9781ce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 21 22:02:48 crc kubenswrapper[4717]: E0221 22:02:48.843803 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-2h82m" podUID="a945001c-fdf1-4bda-8012-3df96d9781ce"
Feb 21 22:02:49 crc kubenswrapper[4717]: E0221 22:02:49.161242 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-2h82m" podUID="a945001c-fdf1-4bda-8012-3df96d9781ce"
Feb 21 22:02:49 crc kubenswrapper[4717]: E0221 22:02:49.585499 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 73487b4e55bb61b1c290d4b6d02950062d260fc32b463da204bf8cbe697e956f is running failed: container process not found" containerID="73487b4e55bb61b1c290d4b6d02950062d260fc32b463da204bf8cbe697e956f" cmd=["grpc_health_probe","-addr=:50051"]
Feb 21 22:02:49 crc kubenswrapper[4717]: E0221 22:02:49.586056 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 73487b4e55bb61b1c290d4b6d02950062d260fc32b463da204bf8cbe697e956f is running failed: container process not found" containerID="73487b4e55bb61b1c290d4b6d02950062d260fc32b463da204bf8cbe697e956f" cmd=["grpc_health_probe","-addr=:50051"]
Feb 21 22:02:49 crc kubenswrapper[4717]: E0221 22:02:49.586397 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 73487b4e55bb61b1c290d4b6d02950062d260fc32b463da204bf8cbe697e956f is running failed: container process not found" containerID="73487b4e55bb61b1c290d4b6d02950062d260fc32b463da204bf8cbe697e956f" cmd=["grpc_health_probe","-addr=:50051"]
Feb 21 22:02:49 crc kubenswrapper[4717]: E0221 22:02:49.586477 4717 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 73487b4e55bb61b1c290d4b6d02950062d260fc32b463da204bf8cbe697e956f is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-knh79" podUID="b0ef6389-987a-492e-8324-8b88a70f659f" containerName="registry-server"
Feb 21 22:02:49 crc kubenswrapper[4717]: I0221 22:02:49.987096 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f087f525-4007-4f12-b4e0-89e5d6b4eafb" path="/var/lib/kubelet/pods/f087f525-4007-4f12-b4e0-89e5d6b4eafb/volumes"
Feb 21 22:02:51 crc kubenswrapper[4717]: I0221 22:02:51.178339 4717 generic.go:334] "Generic (PLEG): container finished" podID="7995c515-84a0-44d3-82e8-99a2ab1fb7b2" containerID="138d13f715b4f6a86656eb38c67929a0c8ee1d111fc73febb47ebc73e54203fa" exitCode=0
Feb 21 22:02:51 crc kubenswrapper[4717]: I0221 22:02:51.178389 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4ttdv" event={"ID":"7995c515-84a0-44d3-82e8-99a2ab1fb7b2","Type":"ContainerDied","Data":"138d13f715b4f6a86656eb38c67929a0c8ee1d111fc73febb47ebc73e54203fa"}
Feb 21 22:02:54 crc kubenswrapper[4717]: I0221 22:02:54.208308 4717 generic.go:334] "Generic (PLEG): container finished" podID="fcb19e52-5b9c-478e-9e86-cc5529c2a6d7" containerID="1712bb4801ea35250b75861058b9239da153385e4b26f1deed32e14c620ce1cc" exitCode=0
Feb 21 22:02:54 crc kubenswrapper[4717]: I0221 22:02:54.208956 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v94db" event={"ID":"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7","Type":"ContainerDied","Data":"1712bb4801ea35250b75861058b9239da153385e4b26f1deed32e14c620ce1cc"}
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.374531 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-knh79"
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.387195 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-v94db"
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.408541 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mfwg\" (UniqueName: \"kubernetes.io/projected/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-kube-api-access-2mfwg\") pod \"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7\" (UID: \"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.408656 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-combined-ca-bundle\") pod \"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7\" (UID: \"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.408711 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-config\") pod \"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7\" (UID: \"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.408749 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bsd2\" (UniqueName: \"kubernetes.io/projected/b0ef6389-987a-492e-8324-8b88a70f659f-kube-api-access-5bsd2\") pod \"b0ef6389-987a-492e-8324-8b88a70f659f\" (UID: \"b0ef6389-987a-492e-8324-8b88a70f659f\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.408922 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ef6389-987a-492e-8324-8b88a70f659f-utilities\") pod \"b0ef6389-987a-492e-8324-8b88a70f659f\" (UID: \"b0ef6389-987a-492e-8324-8b88a70f659f\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.408961 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ef6389-987a-492e-8324-8b88a70f659f-catalog-content\") pod \"b0ef6389-987a-492e-8324-8b88a70f659f\" (UID: \"b0ef6389-987a-492e-8324-8b88a70f659f\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.416680 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0ef6389-987a-492e-8324-8b88a70f659f-utilities" (OuterVolumeSpecName: "utilities") pod "b0ef6389-987a-492e-8324-8b88a70f659f" (UID: "b0ef6389-987a-492e-8324-8b88a70f659f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.419297 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ef6389-987a-492e-8324-8b88a70f659f-kube-api-access-5bsd2" (OuterVolumeSpecName: "kube-api-access-5bsd2") pod "b0ef6389-987a-492e-8324-8b88a70f659f" (UID: "b0ef6389-987a-492e-8324-8b88a70f659f"). InnerVolumeSpecName "kube-api-access-5bsd2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.428488 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-kube-api-access-2mfwg" (OuterVolumeSpecName: "kube-api-access-2mfwg") pod "fcb19e52-5b9c-478e-9e86-cc5529c2a6d7" (UID: "fcb19e52-5b9c-478e-9e86-cc5529c2a6d7"). InnerVolumeSpecName "kube-api-access-2mfwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.428851 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-4ttdv"
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.433127 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv"
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.456353 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-config" (OuterVolumeSpecName: "config") pod "fcb19e52-5b9c-478e-9e86-cc5529c2a6d7" (UID: "fcb19e52-5b9c-478e-9e86-cc5529c2a6d7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.477718 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcb19e52-5b9c-478e-9e86-cc5529c2a6d7" (UID: "fcb19e52-5b9c-478e-9e86-cc5529c2a6d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.513050 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.513113 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-config\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.513168 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bsd2\" (UniqueName: \"kubernetes.io/projected/b0ef6389-987a-492e-8324-8b88a70f659f-kube-api-access-5bsd2\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.513191 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0ef6389-987a-492e-8324-8b88a70f659f-utilities\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.513247 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mfwg\" (UniqueName: \"kubernetes.io/projected/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7-kube-api-access-2mfwg\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.528438 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0ef6389-987a-492e-8324-8b88a70f659f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0ef6389-987a-492e-8324-8b88a70f659f" (UID: "b0ef6389-987a-492e-8324-8b88a70f659f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.614464 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-db-sync-config-data\") pod \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\" (UID: \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.614537 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-dns-swift-storage-0\") pod \"0351efa1-d14f-468d-ad6c-ea432ef629ba\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.614582 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz64r\" (UniqueName: \"kubernetes.io/projected/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-kube-api-access-jz64r\") pod \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\" (UID: \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.614716 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-dns-svc\") pod \"0351efa1-d14f-468d-ad6c-ea432ef629ba\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.614752 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-ovsdbserver-nb\") pod \"0351efa1-d14f-468d-ad6c-ea432ef629ba\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.614827 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-config-data\") pod \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\" (UID: \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.614878 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnbbc\" (UniqueName: \"kubernetes.io/projected/0351efa1-d14f-468d-ad6c-ea432ef629ba-kube-api-access-bnbbc\") pod \"0351efa1-d14f-468d-ad6c-ea432ef629ba\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.614902 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-combined-ca-bundle\") pod \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\" (UID: \"7995c515-84a0-44d3-82e8-99a2ab1fb7b2\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.614930 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-config\") pod \"0351efa1-d14f-468d-ad6c-ea432ef629ba\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.614965 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-ovsdbserver-sb\") pod \"0351efa1-d14f-468d-ad6c-ea432ef629ba\" (UID: \"0351efa1-d14f-468d-ad6c-ea432ef629ba\") "
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.615313 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0ef6389-987a-492e-8324-8b88a70f659f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.623617 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7995c515-84a0-44d3-82e8-99a2ab1fb7b2" (UID: "7995c515-84a0-44d3-82e8-99a2ab1fb7b2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.626100 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0351efa1-d14f-468d-ad6c-ea432ef629ba-kube-api-access-bnbbc" (OuterVolumeSpecName: "kube-api-access-bnbbc") pod "0351efa1-d14f-468d-ad6c-ea432ef629ba" (UID: "0351efa1-d14f-468d-ad6c-ea432ef629ba"). InnerVolumeSpecName "kube-api-access-bnbbc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.627513 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-kube-api-access-jz64r" (OuterVolumeSpecName: "kube-api-access-jz64r") pod "7995c515-84a0-44d3-82e8-99a2ab1fb7b2" (UID: "7995c515-84a0-44d3-82e8-99a2ab1fb7b2"). InnerVolumeSpecName "kube-api-access-jz64r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.647121 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7995c515-84a0-44d3-82e8-99a2ab1fb7b2" (UID: "7995c515-84a0-44d3-82e8-99a2ab1fb7b2"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.667799 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0351efa1-d14f-468d-ad6c-ea432ef629ba" (UID: "0351efa1-d14f-468d-ad6c-ea432ef629ba"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.667826 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0351efa1-d14f-468d-ad6c-ea432ef629ba" (UID: "0351efa1-d14f-468d-ad6c-ea432ef629ba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.670250 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0351efa1-d14f-468d-ad6c-ea432ef629ba" (UID: "0351efa1-d14f-468d-ad6c-ea432ef629ba"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.675016 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-config-data" (OuterVolumeSpecName: "config-data") pod "7995c515-84a0-44d3-82e8-99a2ab1fb7b2" (UID: "7995c515-84a0-44d3-82e8-99a2ab1fb7b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.677389 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-config" (OuterVolumeSpecName: "config") pod "0351efa1-d14f-468d-ad6c-ea432ef629ba" (UID: "0351efa1-d14f-468d-ad6c-ea432ef629ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.680226 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0351efa1-d14f-468d-ad6c-ea432ef629ba" (UID: "0351efa1-d14f-468d-ad6c-ea432ef629ba"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.716779 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.716818 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz64r\" (UniqueName: \"kubernetes.io/projected/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-kube-api-access-jz64r\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.716831 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.716840 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 
21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.716850 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.716874 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnbbc\" (UniqueName: \"kubernetes.io/projected/0351efa1-d14f-468d-ad6c-ea432ef629ba-kube-api-access-bnbbc\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.717340 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.717368 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.717381 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0351efa1-d14f-468d-ad6c-ea432ef629ba-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.717392 4717 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7995c515-84a0-44d3-82e8-99a2ab1fb7b2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.943399 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" podUID="0351efa1-d14f-468d-ad6c-ea432ef629ba" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Feb 21 22:02:56 crc kubenswrapper[4717]: I0221 22:02:56.943886 4717 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:57 crc kubenswrapper[4717]: I0221 22:02:57.240253 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" event={"ID":"0351efa1-d14f-468d-ad6c-ea432ef629ba","Type":"ContainerDied","Data":"4cc2c46d9884013fd156416f731d09efead3e20c3eef2be5a763e2ccc1d4e773"} Feb 21 22:02:57 crc kubenswrapper[4717]: I0221 22:02:57.240309 4717 scope.go:117] "RemoveContainer" containerID="7f2e64ff7f3927feef67353f92620a3d4ae8bbdf80e7870369ab49516feb1b1c" Feb 21 22:02:57 crc kubenswrapper[4717]: I0221 22:02:57.240418 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-djqlv" Feb 21 22:02:57 crc kubenswrapper[4717]: I0221 22:02:57.249644 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4ttdv" event={"ID":"7995c515-84a0-44d3-82e8-99a2ab1fb7b2","Type":"ContainerDied","Data":"200e746d0a89678f59093fbd1064fc825984321f59efc9c5820768362e3096f3"} Feb 21 22:02:57 crc kubenswrapper[4717]: I0221 22:02:57.249701 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="200e746d0a89678f59093fbd1064fc825984321f59efc9c5820768362e3096f3" Feb 21 22:02:57 crc kubenswrapper[4717]: I0221 22:02:57.249795 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-4ttdv" Feb 21 22:02:57 crc kubenswrapper[4717]: I0221 22:02:57.254609 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knh79" event={"ID":"b0ef6389-987a-492e-8324-8b88a70f659f","Type":"ContainerDied","Data":"2f976312bb0556b7c6a13ca90f3e55f9b57869a915bda10828b9e280adaee9bf"} Feb 21 22:02:57 crc kubenswrapper[4717]: I0221 22:02:57.254707 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-knh79" Feb 21 22:02:57 crc kubenswrapper[4717]: I0221 22:02:57.258087 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v94db" event={"ID":"fcb19e52-5b9c-478e-9e86-cc5529c2a6d7","Type":"ContainerDied","Data":"76274386de511323bd2b8b330ff66a076bef4cf47c44dabdacc3ef3b89bee5fb"} Feb 21 22:02:57 crc kubenswrapper[4717]: I0221 22:02:57.258160 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76274386de511323bd2b8b330ff66a076bef4cf47c44dabdacc3ef3b89bee5fb" Feb 21 22:02:57 crc kubenswrapper[4717]: I0221 22:02:57.258252 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-v94db" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.083666 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-knh79"] Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.090519 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-knh79"] Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.101920 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-djqlv"] Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.119725 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-djqlv"] Feb 21 22:02:58 crc kubenswrapper[4717]: E0221 22:02:58.267833 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 21 22:02:58 crc kubenswrapper[4717]: E0221 22:02:58.268011 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f7jlg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompPro
file:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-6xk4j_openstack(3727ff36-57dd-4c91-ab08-d5c87ee4e357): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 21 22:02:58 crc kubenswrapper[4717]: E0221 22:02:58.270468 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-6xk4j" podUID="3727ff36-57dd-4c91-ab08-d5c87ee4e357" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.315171 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65965d6475-rzdqw"] Feb 21 22:02:58 crc kubenswrapper[4717]: E0221 22:02:58.315602 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ef6389-987a-492e-8324-8b88a70f659f" containerName="extract-content" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.315634 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ef6389-987a-492e-8324-8b88a70f659f" containerName="extract-content" Feb 21 22:02:58 crc kubenswrapper[4717]: E0221 22:02:58.315648 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7995c515-84a0-44d3-82e8-99a2ab1fb7b2" containerName="glance-db-sync" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.315656 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7995c515-84a0-44d3-82e8-99a2ab1fb7b2" containerName="glance-db-sync" Feb 21 22:02:58 crc kubenswrapper[4717]: E0221 22:02:58.315682 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb19e52-5b9c-478e-9e86-cc5529c2a6d7" containerName="neutron-db-sync" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.315690 
4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb19e52-5b9c-478e-9e86-cc5529c2a6d7" containerName="neutron-db-sync" Feb 21 22:02:58 crc kubenswrapper[4717]: E0221 22:02:58.315701 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ef6389-987a-492e-8324-8b88a70f659f" containerName="extract-utilities" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.315709 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ef6389-987a-492e-8324-8b88a70f659f" containerName="extract-utilities" Feb 21 22:02:58 crc kubenswrapper[4717]: E0221 22:02:58.315718 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0351efa1-d14f-468d-ad6c-ea432ef629ba" containerName="dnsmasq-dns" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.315728 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0351efa1-d14f-468d-ad6c-ea432ef629ba" containerName="dnsmasq-dns" Feb 21 22:02:58 crc kubenswrapper[4717]: E0221 22:02:58.315740 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ef6389-987a-492e-8324-8b88a70f659f" containerName="registry-server" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.315748 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ef6389-987a-492e-8324-8b88a70f659f" containerName="registry-server" Feb 21 22:02:58 crc kubenswrapper[4717]: E0221 22:02:58.315764 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0351efa1-d14f-468d-ad6c-ea432ef629ba" containerName="init" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.315772 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0351efa1-d14f-468d-ad6c-ea432ef629ba" containerName="init" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.316029 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0351efa1-d14f-468d-ad6c-ea432ef629ba" containerName="dnsmasq-dns" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.316052 4717 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="fcb19e52-5b9c-478e-9e86-cc5529c2a6d7" containerName="neutron-db-sync" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.316067 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7995c515-84a0-44d3-82e8-99a2ab1fb7b2" containerName="glance-db-sync" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.316089 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ef6389-987a-492e-8324-8b88a70f659f" containerName="registry-server" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.317107 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.365393 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65965d6475-rzdqw"] Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.412853 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8fbcb64b8-88w42"] Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.414607 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.421790 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-x55b2" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.422102 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.422303 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.422465 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.430852 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8fbcb64b8-88w42"] Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.438173 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65965d6475-rzdqw"] Feb 21 22:02:58 crc kubenswrapper[4717]: E0221 22:02:58.439951 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-dktq8 ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-65965d6475-rzdqw" podUID="e921fd28-285d-4acd-928b-71928964f61b" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.457004 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-ovsdbserver-nb\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.457067 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-config\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.457103 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-ovsdbserver-sb\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.457122 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-httpd-config\") pod \"neutron-8fbcb64b8-88w42\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.457141 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-dns-svc\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.457196 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grnf4\" (UniqueName: \"kubernetes.io/projected/7dccedac-c29e-4ae2-bfac-d55b444cb715-kube-api-access-grnf4\") pod \"neutron-8fbcb64b8-88w42\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.457217 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-dns-swift-storage-0\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.457247 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-ovndb-tls-certs\") pod \"neutron-8fbcb64b8-88w42\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.457271 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-combined-ca-bundle\") pod \"neutron-8fbcb64b8-88w42\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.457306 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dktq8\" (UniqueName: \"kubernetes.io/projected/e921fd28-285d-4acd-928b-71928964f61b-kube-api-access-dktq8\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.457329 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-config\") pod \"neutron-8fbcb64b8-88w42\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.496805 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-84b966f6c9-hqhkj"] Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.498286 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.503842 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-hqhkj"] Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559057 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-config\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559118 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559144 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-ovsdbserver-sb\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559165 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-httpd-config\") pod \"neutron-8fbcb64b8-88w42\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 
22:02:58.559194 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-dns-svc\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559221 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559244 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559263 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grnf4\" (UniqueName: \"kubernetes.io/projected/7dccedac-c29e-4ae2-bfac-d55b444cb715-kube-api-access-grnf4\") pod \"neutron-8fbcb64b8-88w42\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559285 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-dns-swift-storage-0\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559304 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvsjf\" (UniqueName: \"kubernetes.io/projected/92156922-8dc6-4d2f-9a66-92c8f049374c-kube-api-access-dvsjf\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559331 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-ovndb-tls-certs\") pod \"neutron-8fbcb64b8-88w42\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559348 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559368 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-combined-ca-bundle\") pod \"neutron-8fbcb64b8-88w42\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559415 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dktq8\" (UniqueName: \"kubernetes.io/projected/e921fd28-285d-4acd-928b-71928964f61b-kube-api-access-dktq8\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559435 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-config\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559452 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-config\") pod \"neutron-8fbcb64b8-88w42\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.559487 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-ovsdbserver-nb\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.560275 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-ovsdbserver-nb\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.560820 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-config\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.561169 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-dns-swift-storage-0\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.562485 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-ovsdbserver-sb\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.564386 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-dns-svc\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.566403 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-ovndb-tls-certs\") pod \"neutron-8fbcb64b8-88w42\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.566490 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-httpd-config\") pod \"neutron-8fbcb64b8-88w42\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.571278 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-combined-ca-bundle\") pod \"neutron-8fbcb64b8-88w42\" (UID: 
\"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.577266 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-config\") pod \"neutron-8fbcb64b8-88w42\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.582674 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grnf4\" (UniqueName: \"kubernetes.io/projected/7dccedac-c29e-4ae2-bfac-d55b444cb715-kube-api-access-grnf4\") pod \"neutron-8fbcb64b8-88w42\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.604060 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dktq8\" (UniqueName: \"kubernetes.io/projected/e921fd28-285d-4acd-928b-71928964f61b-kube-api-access-dktq8\") pod \"dnsmasq-dns-65965d6475-rzdqw\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.661880 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.661963 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-config\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc 
kubenswrapper[4717]: I0221 22:02:58.662028 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.662068 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.662089 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.662114 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvsjf\" (UniqueName: \"kubernetes.io/projected/92156922-8dc6-4d2f-9a66-92c8f049374c-kube-api-access-dvsjf\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.662636 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.663095 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.664448 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.664643 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.664989 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-config\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.680676 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvsjf\" (UniqueName: \"kubernetes.io/projected/92156922-8dc6-4d2f-9a66-92c8f049374c-kube-api-access-dvsjf\") pod \"dnsmasq-dns-84b966f6c9-hqhkj\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.762782 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:02:58 crc kubenswrapper[4717]: I0221 22:02:58.844274 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.068042 4717 scope.go:117] "RemoveContainer" containerID="aa595f9a91865784c26f785813687011b447429c09f9e83f3f4a8ea06d205916" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.106123 4717 scope.go:117] "RemoveContainer" containerID="73487b4e55bb61b1c290d4b6d02950062d260fc32b463da204bf8cbe697e956f" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.173130 4717 scope.go:117] "RemoveContainer" containerID="843981a6151052a7bdd420116752c5f52aa9283a4ab543709c17e30d9c56bb80" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.249434 4717 scope.go:117] "RemoveContainer" containerID="7df5fcbeecc7602ad0212f057ba7b16afc2cfac946b7127d9d82fd27bafadfdd" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.278956 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.281208 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.293188 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.302251 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-pp2fz" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.302527 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.309559 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.329198 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zsxs4" event={"ID":"d38c89d0-4315-4d98-86bc-570662736bba","Type":"ContainerStarted","Data":"ef3cb12c3f3bed326a7e03e36648cd3104dcb58e7c559d206a74cd53af5fbee2"} Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.329318 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:59 crc kubenswrapper[4717]: E0221 22:02:59.332353 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-6xk4j" podUID="3727ff36-57dd-4c91-ab08-d5c87ee4e357" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.375291 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zsxs4" podStartSLOduration=4.642082558 podStartE2EDuration="29.375270301s" podCreationTimestamp="2026-02-21 22:02:30 +0000 UTC" firstStartedPulling="2026-02-21 22:02:31.49742708 +0000 UTC m=+966.278960702" lastFinishedPulling="2026-02-21 22:02:56.230614783 +0000 UTC m=+991.012148445" observedRunningTime="2026-02-21 22:02:59.36853112 +0000 UTC m=+994.150064742" watchObservedRunningTime="2026-02-21 22:02:59.375270301 +0000 UTC m=+994.156803923" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.376015 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-scripts\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.376064 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-config-data\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.376094 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.376131 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccdfe816-90de-4a26-bd0f-8151a34322c8-logs\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.376181 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.376217 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4wps\" (UniqueName: \"kubernetes.io/projected/ccdfe816-90de-4a26-bd0f-8151a34322c8-kube-api-access-l4wps\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.376276 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ccdfe816-90de-4a26-bd0f-8151a34322c8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.421043 4717 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rrpvp"] Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.428272 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86b8dffbf6-mztpd"] Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.478642 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-config-data\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.480048 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.480099 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccdfe816-90de-4a26-bd0f-8151a34322c8-logs\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.480159 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.480219 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4wps\" (UniqueName: 
\"kubernetes.io/projected/ccdfe816-90de-4a26-bd0f-8151a34322c8-kube-api-access-l4wps\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.480277 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ccdfe816-90de-4a26-bd0f-8151a34322c8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.480354 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-scripts\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.480488 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.483796 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccdfe816-90de-4a26-bd0f-8151a34322c8-logs\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.483928 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ccdfe816-90de-4a26-bd0f-8151a34322c8-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.488254 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.488641 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-config-data\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.494635 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-scripts\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.496815 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4wps\" (UniqueName: \"kubernetes.io/projected/ccdfe816-90de-4a26-bd0f-8151a34322c8-kube-api-access-l4wps\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.520318 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " 
pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: W0221 22:02:59.547494 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e230efb_55a4_4e7f_9d9a_cc61d3123eab.slice/crio-efc76a82edf2af0b383e72455311ddbc368a9d88c6fae3ea6223fc29996de81a WatchSource:0}: Error finding container efc76a82edf2af0b383e72455311ddbc368a9d88c6fae3ea6223fc29996de81a: Status 404 returned error can't find the container with id efc76a82edf2af0b383e72455311ddbc368a9d88c6fae3ea6223fc29996de81a Feb 21 22:02:59 crc kubenswrapper[4717]: W0221 22:02:59.549960 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda647fe81_8c83_4f0c_996b_1a71081700f0.slice/crio-2c5f2f2ce15b03ec1bfdca97bb9da1c9a7706c167e9a2730baace91d820c5e4c WatchSource:0}: Error finding container 2c5f2f2ce15b03ec1bfdca97bb9da1c9a7706c167e9a2730baace91d820c5e4c: Status 404 returned error can't find the container with id 2c5f2f2ce15b03ec1bfdca97bb9da1c9a7706c167e9a2730baace91d820c5e4c Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.580009 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.582401 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.584226 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.607122 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.624082 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.625453 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65965d6475-rzdqw" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.636680 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-669df94976-tmfpb"] Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.703231 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8fbcb64b8-88w42"] Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.732571 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-hqhkj"] Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.791806 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-ovsdbserver-nb\") pod \"e921fd28-285d-4acd-928b-71928964f61b\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.791901 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-ovsdbserver-sb\") pod \"e921fd28-285d-4acd-928b-71928964f61b\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.791964 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-dns-swift-storage-0\") pod \"e921fd28-285d-4acd-928b-71928964f61b\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.792001 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-config\") pod \"e921fd28-285d-4acd-928b-71928964f61b\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.792104 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-dns-svc\") pod \"e921fd28-285d-4acd-928b-71928964f61b\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.792138 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dktq8\" (UniqueName: \"kubernetes.io/projected/e921fd28-285d-4acd-928b-71928964f61b-kube-api-access-dktq8\") pod \"e921fd28-285d-4acd-928b-71928964f61b\" (UID: \"e921fd28-285d-4acd-928b-71928964f61b\") " Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.792337 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/327110cc-7c97-45f0-9a06-433a0cd67d0d-logs\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.792366 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.792385 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e921fd28-285d-4acd-928b-71928964f61b" (UID: 
"e921fd28-285d-4acd-928b-71928964f61b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.792414 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.792446 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/327110cc-7c97-45f0-9a06-433a0cd67d0d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.792461 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrqc6\" (UniqueName: \"kubernetes.io/projected/327110cc-7c97-45f0-9a06-433a0cd67d0d-kube-api-access-vrqc6\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.792497 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.792526 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0"
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.792589 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.793262 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e921fd28-285d-4acd-928b-71928964f61b" (UID: "e921fd28-285d-4acd-928b-71928964f61b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.793626 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-config" (OuterVolumeSpecName: "config") pod "e921fd28-285d-4acd-928b-71928964f61b" (UID: "e921fd28-285d-4acd-928b-71928964f61b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.793682 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e921fd28-285d-4acd-928b-71928964f61b" (UID: "e921fd28-285d-4acd-928b-71928964f61b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.794070 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e921fd28-285d-4acd-928b-71928964f61b" (UID: "e921fd28-285d-4acd-928b-71928964f61b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.805247 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e921fd28-285d-4acd-928b-71928964f61b-kube-api-access-dktq8" (OuterVolumeSpecName: "kube-api-access-dktq8") pod "e921fd28-285d-4acd-928b-71928964f61b" (UID: "e921fd28-285d-4acd-928b-71928964f61b"). InnerVolumeSpecName "kube-api-access-dktq8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.895498 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0"
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.903649 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0"
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.904233 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/327110cc-7c97-45f0-9a06-433a0cd67d0d-logs\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0"
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.904373 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0"
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.906013 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0"
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.906128 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/327110cc-7c97-45f0-9a06-433a0cd67d0d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0"
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.906192 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrqc6\" (UniqueName: \"kubernetes.io/projected/327110cc-7c97-45f0-9a06-433a0cd67d0d-kube-api-access-vrqc6\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0"
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.906568 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0"
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.906700 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.906721 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dktq8\" (UniqueName: \"kubernetes.io/projected/e921fd28-285d-4acd-928b-71928964f61b-kube-api-access-dktq8\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.906737 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.906750 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.906763 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e921fd28-285d-4acd-928b-71928964f61b-config\") on node \"crc\" DevicePath \"\""
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.913417 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0"
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.913628 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/327110cc-7c97-45f0-9a06-433a0cd67d0d-logs\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0"
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.916020 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/327110cc-7c97-45f0-9a06-433a0cd67d0d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0"
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.924180 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0"
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.940034 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0"
Feb 21 22:02:59 crc kubenswrapper[4717]: I0221 22:02:59.940524 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrqc6\" (UniqueName: \"kubernetes.io/projected/327110cc-7c97-45f0-9a06-433a0cd67d0d-kube-api-access-vrqc6\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0"
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.010900 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0351efa1-d14f-468d-ad6c-ea432ef629ba" path="/var/lib/kubelet/pods/0351efa1-d14f-468d-ad6c-ea432ef629ba/volumes"
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.011650 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ef6389-987a-492e-8324-8b88a70f659f" path="/var/lib/kubelet/pods/b0ef6389-987a-492e-8324-8b88a70f659f/volumes"
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.015634 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " pod="openstack/glance-default-internal-api-0"
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.255530 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.309387 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.419417 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ccdfe816-90de-4a26-bd0f-8151a34322c8","Type":"ContainerStarted","Data":"3db772312fbe0c82d61b527bb8d9036acd220d3c2ab74ce41d45cdca0231359c"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.443646 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rrpvp" event={"ID":"a647fe81-8c83-4f0c-996b-1a71081700f0","Type":"ContainerStarted","Data":"c8c590bffc5fc340f5d8a8a7611c6d7b82c628ffa8a5d49cfd7f3d19a1152bb0"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.443702 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rrpvp" event={"ID":"a647fe81-8c83-4f0c-996b-1a71081700f0","Type":"ContainerStarted","Data":"2c5f2f2ce15b03ec1bfdca97bb9da1c9a7706c167e9a2730baace91d820c5e4c"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.460709 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e","Type":"ContainerStarted","Data":"7e423e308e430f4b5eb77d84609197685709348b1f5cd9bb210c457c78f89114"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.466114 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rrpvp" podStartSLOduration=12.466083961 podStartE2EDuration="12.466083961s" podCreationTimestamp="2026-02-21 22:02:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:00.464489263 +0000 UTC m=+995.246022885" watchObservedRunningTime="2026-02-21 22:03:00.466083961 +0000 UTC m=+995.247617583"
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.488131 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b5d497dbf-bkvz6" event={"ID":"09f86cea-494d-4f11-b9f5-2045f7aabd92","Type":"ContainerStarted","Data":"208618fb7dba17474af74671a8910f21495f0c58490ed4c913725f2d1bc0de38"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.488215 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5b5d497dbf-bkvz6" podUID="09f86cea-494d-4f11-b9f5-2045f7aabd92" containerName="horizon-log" containerID="cri-o://208618fb7dba17474af74671a8910f21495f0c58490ed4c913725f2d1bc0de38" gracePeriod=30
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.488367 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5b5d497dbf-bkvz6" podUID="09f86cea-494d-4f11-b9f5-2045f7aabd92" containerName="horizon" containerID="cri-o://e150b7b66dcbe638973e20a725f834dd974991b8313f7237c6399a518093e9e7" gracePeriod=30
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.500619 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8fbcb64b8-88w42" event={"ID":"7dccedac-c29e-4ae2-bfac-d55b444cb715","Type":"ContainerStarted","Data":"cd9cdcb54d005627316cfa56c267d1c64c79db62fb498db101e2f032a6ec6998"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.500674 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8fbcb64b8-88w42" event={"ID":"7dccedac-c29e-4ae2-bfac-d55b444cb715","Type":"ContainerStarted","Data":"4c0e2534efe08bea1bd840cb0bc7a32c8d7b2606165a5b5e6dc3be7b59fe0a6f"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.505323 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c6f587885-vvp7f" event={"ID":"ea65478f-2244-4670-9821-526d00eb1b9a","Type":"ContainerStarted","Data":"3ac8b97244cc690abebbaa031bc67fa0f674a3eb556ca825dac2ea5e36a68c74"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.505370 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c6f587885-vvp7f" event={"ID":"ea65478f-2244-4670-9821-526d00eb1b9a","Type":"ContainerStarted","Data":"e504ae7a5cf31b49bcfa96a5edbc150c8e83603ed82f92f5b46325353da9c6d7"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.505499 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c6f587885-vvp7f" podUID="ea65478f-2244-4670-9821-526d00eb1b9a" containerName="horizon-log" containerID="cri-o://e504ae7a5cf31b49bcfa96a5edbc150c8e83603ed82f92f5b46325353da9c6d7" gracePeriod=30
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.505743 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c6f587885-vvp7f" podUID="ea65478f-2244-4670-9821-526d00eb1b9a" containerName="horizon" containerID="cri-o://3ac8b97244cc690abebbaa031bc67fa0f674a3eb556ca825dac2ea5e36a68c74" gracePeriod=30
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.518182 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-669df94976-tmfpb" event={"ID":"7af1cf64-7044-4170-9ba4-bcc17d97cbb2","Type":"ContainerStarted","Data":"3e81fcad833c7e7a7f433096a45c795ab83d621d5970448361c95c6961841ed0"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.518224 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-669df94976-tmfpb" event={"ID":"7af1cf64-7044-4170-9ba4-bcc17d97cbb2","Type":"ContainerStarted","Data":"6b2c175bec57cb55b86d9f9bc56882730c4439ae1be8aeb645dfee9739677a7a"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.519761 4717 generic.go:334] "Generic (PLEG): container finished" podID="92156922-8dc6-4d2f-9a66-92c8f049374c" containerID="32d7bc1dc16c91f904f341c512358a0536ee023dcfc922157a3e6cd93c18587f" exitCode=0
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.519803 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" event={"ID":"92156922-8dc6-4d2f-9a66-92c8f049374c","Type":"ContainerDied","Data":"32d7bc1dc16c91f904f341c512358a0536ee023dcfc922157a3e6cd93c18587f"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.519817 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" event={"ID":"92156922-8dc6-4d2f-9a66-92c8f049374c","Type":"ContainerStarted","Data":"97d747abb4485011abdc6d38de9ff90cef2778ae2a730e8b0b8b853037b830ec"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.528764 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86b8dffbf6-mztpd" event={"ID":"4e230efb-55a4-4e7f-9d9a-cc61d3123eab","Type":"ContainerStarted","Data":"0de1bc2b4ad9a796b9759b0f485bd466a7813afbb79e20e2babc9798f09489c6"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.528821 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86b8dffbf6-mztpd" event={"ID":"4e230efb-55a4-4e7f-9d9a-cc61d3123eab","Type":"ContainerStarted","Data":"3b0f48298c95e390a2b9ff8ed6fe4de4dbce1f50389eb27d4666640a8cdc00c3"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.528834 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86b8dffbf6-mztpd" event={"ID":"4e230efb-55a4-4e7f-9d9a-cc61d3123eab","Type":"ContainerStarted","Data":"efc76a82edf2af0b383e72455311ddbc368a9d88c6fae3ea6223fc29996de81a"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.549374 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5c6f587885-vvp7f" podStartSLOduration=3.517350916 podStartE2EDuration="27.549355076s" podCreationTimestamp="2026-02-21 22:02:33 +0000 UTC" firstStartedPulling="2026-02-21 22:02:34.185079216 +0000 UTC m=+968.966612838" lastFinishedPulling="2026-02-21 22:02:58.217083376 +0000 UTC m=+992.998616998" observedRunningTime="2026-02-21 22:03:00.544577882 +0000 UTC m=+995.326111504" watchObservedRunningTime="2026-02-21 22:03:00.549355076 +0000 UTC m=+995.330888698"
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.552298 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65965d6475-rzdqw"
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.553122 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7ff8b8ffdf-4dwxk" podUID="439eab0e-0489-4a97-993e-c6c3df03e694" containerName="horizon-log" containerID="cri-o://8543a25dbf077d192eea9219229f62802d037cf29feaa2444a1507cb3cc6e9e7" gracePeriod=30
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.553391 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ff8b8ffdf-4dwxk" event={"ID":"439eab0e-0489-4a97-993e-c6c3df03e694","Type":"ContainerStarted","Data":"86c54231ad6bf38342e808fd0a76f35e200e9ceaff8aa1c46db4a352090a3b06"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.553418 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ff8b8ffdf-4dwxk" event={"ID":"439eab0e-0489-4a97-993e-c6c3df03e694","Type":"ContainerStarted","Data":"8543a25dbf077d192eea9219229f62802d037cf29feaa2444a1507cb3cc6e9e7"}
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.553429 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7ff8b8ffdf-4dwxk" podUID="439eab0e-0489-4a97-993e-c6c3df03e694" containerName="horizon" containerID="cri-o://86c54231ad6bf38342e808fd0a76f35e200e9ceaff8aa1c46db4a352090a3b06" gracePeriod=30
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.559944 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5b5d497dbf-bkvz6" podStartSLOduration=2.755945369 podStartE2EDuration="30.55992724s" podCreationTimestamp="2026-02-21 22:02:30 +0000 UTC" firstStartedPulling="2026-02-21 22:02:31.196459178 +0000 UTC m=+965.977992790" lastFinishedPulling="2026-02-21 22:02:59.000441029 +0000 UTC m=+993.781974661" observedRunningTime="2026-02-21 22:03:00.517035262 +0000 UTC m=+995.298568884" watchObservedRunningTime="2026-02-21 22:03:00.55992724 +0000 UTC m=+995.341460862"
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.618520 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-86b8dffbf6-mztpd" podStartSLOduration=22.618502544000002 podStartE2EDuration="22.618502544s" podCreationTimestamp="2026-02-21 22:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:00.608430302 +0000 UTC m=+995.389963924" watchObservedRunningTime="2026-02-21 22:03:00.618502544 +0000 UTC m=+995.400036166"
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.688285 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65965d6475-rzdqw"]
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.696896 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7ff8b8ffdf-4dwxk" podStartSLOduration=3.82585122 podStartE2EDuration="30.696875972s" podCreationTimestamp="2026-02-21 22:02:30 +0000 UTC" firstStartedPulling="2026-02-21 22:02:32.094915899 +0000 UTC m=+966.876449521" lastFinishedPulling="2026-02-21 22:02:58.965940651 +0000 UTC m=+993.747474273" observedRunningTime="2026-02-21 22:03:00.680334046 +0000 UTC m=+995.461867668" watchObservedRunningTime="2026-02-21 22:03:00.696875972 +0000 UTC m=+995.478409594"
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.697529 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65965d6475-rzdqw"]
Feb 21 22:03:00 crc kubenswrapper[4717]: I0221 22:03:00.976916 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7ff8b8ffdf-4dwxk"
Feb 21 22:03:01 crc kubenswrapper[4717]: I0221 22:03:01.138525 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 21 22:03:01 crc kubenswrapper[4717]: I0221 22:03:01.574890 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ccdfe816-90de-4a26-bd0f-8151a34322c8","Type":"ContainerStarted","Data":"4ebc6437b28c3f7c0d315587afb1d9c4213ee33e77b4215e5a932e6dd77aa790"}
Feb 21 22:03:01 crc kubenswrapper[4717]: I0221 22:03:01.601313 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-669df94976-tmfpb" event={"ID":"7af1cf64-7044-4170-9ba4-bcc17d97cbb2","Type":"ContainerStarted","Data":"81ea984b703415ccda6d6772c5e00a9f9a88a1f920ae9fd881bd21238a2548b2"}
Feb 21 22:03:01 crc kubenswrapper[4717]: I0221 22:03:01.612787 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" event={"ID":"92156922-8dc6-4d2f-9a66-92c8f049374c","Type":"ContainerStarted","Data":"43f6e3127cebbc2bdc2e9dd347aba02fc31ecc713979b91ab03c50f957e619ed"}
Feb 21 22:03:01 crc kubenswrapper[4717]: I0221 22:03:01.613054 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj"
Feb 21 22:03:01 crc kubenswrapper[4717]: I0221 22:03:01.642224 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-669df94976-tmfpb" podStartSLOduration=23.642207486 podStartE2EDuration="23.642207486s" podCreationTimestamp="2026-02-21 22:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:01.626477388 +0000 UTC m=+996.408011030" watchObservedRunningTime="2026-02-21 22:03:01.642207486 +0000 UTC m=+996.423741108"
Feb 21 22:03:01 crc kubenswrapper[4717]: I0221 22:03:01.645681 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b5d497dbf-bkvz6" event={"ID":"09f86cea-494d-4f11-b9f5-2045f7aabd92","Type":"ContainerStarted","Data":"e150b7b66dcbe638973e20a725f834dd974991b8313f7237c6399a518093e9e7"}
Feb 21 22:03:01 crc kubenswrapper[4717]: I0221 22:03:01.651264 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"327110cc-7c97-45f0-9a06-433a0cd67d0d","Type":"ContainerStarted","Data":"712828d495d6e291fe4b774a466c46d4423983fbdcee665b39e1bb70ffeceeb6"}
Feb 21 22:03:01 crc kubenswrapper[4717]: I0221 22:03:01.652332 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" podStartSLOduration=3.652312968 podStartE2EDuration="3.652312968s" podCreationTimestamp="2026-02-21 22:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:01.648996139 +0000 UTC m=+996.430529761" watchObservedRunningTime="2026-02-21 22:03:01.652312968 +0000 UTC m=+996.433846590"
Feb 21 22:03:01 crc kubenswrapper[4717]: I0221 22:03:01.692132 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8fbcb64b8-88w42" event={"ID":"7dccedac-c29e-4ae2-bfac-d55b444cb715","Type":"ContainerStarted","Data":"42c29a25ab668a8d51496582563028daa1de047fe727efa08fd3504144a43836"}
Feb 21 22:03:01 crc kubenswrapper[4717]: I0221 22:03:01.692539 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8fbcb64b8-88w42"
Feb 21 22:03:02 crc kubenswrapper[4717]: I0221 22:03:02.001012 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8fbcb64b8-88w42" podStartSLOduration=4.000990273 podStartE2EDuration="4.000990273s" podCreationTimestamp="2026-02-21 22:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:01.710772709 +0000 UTC m=+996.492306331" watchObservedRunningTime="2026-02-21 22:03:02.000990273 +0000 UTC m=+996.782523885"
Feb 21 22:03:02 crc kubenswrapper[4717]: I0221 22:03:02.003597 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e921fd28-285d-4acd-928b-71928964f61b" path="/var/lib/kubelet/pods/e921fd28-285d-4acd-928b-71928964f61b/volumes"
Feb 21 22:03:02 crc kubenswrapper[4717]: E0221 22:03:02.609001 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd38c89d0_4315_4d98_86bc_570662736bba.slice/crio-conmon-ef3cb12c3f3bed326a7e03e36648cd3104dcb58e7c559d206a74cd53af5fbee2.scope\": RecentStats: unable to find data in memory cache]"
Feb 21 22:03:02 crc kubenswrapper[4717]: I0221 22:03:02.695501 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 21 22:03:02 crc kubenswrapper[4717]: I0221 22:03:02.734981 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ccdfe816-90de-4a26-bd0f-8151a34322c8","Type":"ContainerStarted","Data":"f21d53979e1dc367da643ec51fb16005e66be2c4c14df1e76d83f3ae0b301dcd"}
Feb 21 22:03:02 crc kubenswrapper[4717]: I0221 22:03:02.753292 4717 generic.go:334] "Generic (PLEG): container finished" podID="d38c89d0-4315-4d98-86bc-570662736bba" containerID="ef3cb12c3f3bed326a7e03e36648cd3104dcb58e7c559d206a74cd53af5fbee2" exitCode=0
Feb 21 22:03:02 crc kubenswrapper[4717]: I0221 22:03:02.753414 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zsxs4" event={"ID":"d38c89d0-4315-4d98-86bc-570662736bba","Type":"ContainerDied","Data":"ef3cb12c3f3bed326a7e03e36648cd3104dcb58e7c559d206a74cd53af5fbee2"}
Feb 21 22:03:02 crc kubenswrapper[4717]: I0221 22:03:02.765919 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"327110cc-7c97-45f0-9a06-433a0cd67d0d","Type":"ContainerStarted","Data":"bab516f922ba492d23ddaa1702d21dd69e30a88a25e38e086d0c0d25b510d562"}
Feb 21 22:03:02 crc kubenswrapper[4717]: I0221 22:03:02.771667 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.771652681 podStartE2EDuration="4.771652681s" podCreationTimestamp="2026-02-21 22:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:02.76238867 +0000 UTC m=+997.543922292" watchObservedRunningTime="2026-02-21 22:03:02.771652681 +0000 UTC m=+997.553186303"
Feb 21 22:03:02 crc kubenswrapper[4717]: I0221 22:03:02.802053 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 21 22:03:03 crc kubenswrapper[4717]: I0221 22:03:03.601204 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5c6f587885-vvp7f"
Feb 21 22:03:03 crc kubenswrapper[4717]: I0221 22:03:03.778804 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"327110cc-7c97-45f0-9a06-433a0cd67d0d","Type":"ContainerStarted","Data":"fb925d246e20ca659a2607767adc29e3624b48567664a449b10bc0256ab85cf7"}
Feb 21 22:03:03 crc kubenswrapper[4717]: I0221 22:03:03.778870 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ccdfe816-90de-4a26-bd0f-8151a34322c8" containerName="glance-log" containerID="cri-o://4ebc6437b28c3f7c0d315587afb1d9c4213ee33e77b4215e5a932e6dd77aa790" gracePeriod=30
Feb 21 22:03:03 crc kubenswrapper[4717]: I0221 22:03:03.778995 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ccdfe816-90de-4a26-bd0f-8151a34322c8" containerName="glance-httpd" containerID="cri-o://f21d53979e1dc367da643ec51fb16005e66be2c4c14df1e76d83f3ae0b301dcd" gracePeriod=30
Feb 21 22:03:03 crc kubenswrapper[4717]: I0221 22:03:03.779190 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="327110cc-7c97-45f0-9a06-433a0cd67d0d" containerName="glance-httpd" containerID="cri-o://fb925d246e20ca659a2607767adc29e3624b48567664a449b10bc0256ab85cf7" gracePeriod=30
Feb 21 22:03:03 crc kubenswrapper[4717]: I0221 22:03:03.779136 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="327110cc-7c97-45f0-9a06-433a0cd67d0d" containerName="glance-log" containerID="cri-o://bab516f922ba492d23ddaa1702d21dd69e30a88a25e38e086d0c0d25b510d562" gracePeriod=30
Feb 21 22:03:03 crc kubenswrapper[4717]: I0221 22:03:03.813109 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.813093648 podStartE2EDuration="5.813093648s" podCreationTimestamp="2026-02-21 22:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:03.812827402 +0000 UTC m=+998.594361024" watchObservedRunningTime="2026-02-21 22:03:03.813093648 +0000 UTC m=+998.594627270"
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.233599 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zsxs4"
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.324348 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-config-data\") pod \"d38c89d0-4315-4d98-86bc-570662736bba\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") "
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.324422 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-combined-ca-bundle\") pod \"d38c89d0-4315-4d98-86bc-570662736bba\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") "
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.324510 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p2tz\" (UniqueName: \"kubernetes.io/projected/d38c89d0-4315-4d98-86bc-570662736bba-kube-api-access-6p2tz\") pod \"d38c89d0-4315-4d98-86bc-570662736bba\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") "
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.324580 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-scripts\") pod \"d38c89d0-4315-4d98-86bc-570662736bba\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") "
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.325004 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d38c89d0-4315-4d98-86bc-570662736bba-logs\") pod \"d38c89d0-4315-4d98-86bc-570662736bba\" (UID: \"d38c89d0-4315-4d98-86bc-570662736bba\") "
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.326283 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d38c89d0-4315-4d98-86bc-570662736bba-logs" (OuterVolumeSpecName: "logs") pod "d38c89d0-4315-4d98-86bc-570662736bba" (UID: "d38c89d0-4315-4d98-86bc-570662736bba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.330321 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d38c89d0-4315-4d98-86bc-570662736bba-kube-api-access-6p2tz" (OuterVolumeSpecName: "kube-api-access-6p2tz") pod "d38c89d0-4315-4d98-86bc-570662736bba" (UID: "d38c89d0-4315-4d98-86bc-570662736bba"). InnerVolumeSpecName "kube-api-access-6p2tz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.330574 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-scripts" (OuterVolumeSpecName: "scripts") pod "d38c89d0-4315-4d98-86bc-570662736bba" (UID: "d38c89d0-4315-4d98-86bc-570662736bba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.350223 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d38c89d0-4315-4d98-86bc-570662736bba" (UID: "d38c89d0-4315-4d98-86bc-570662736bba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.351968 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-config-data" (OuterVolumeSpecName: "config-data") pod "d38c89d0-4315-4d98-86bc-570662736bba" (UID: "d38c89d0-4315-4d98-86bc-570662736bba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.429655 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-config-data\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.429684 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.429698 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p2tz\" (UniqueName: \"kubernetes.io/projected/d38c89d0-4315-4d98-86bc-570662736bba-kube-api-access-6p2tz\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.429707 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d38c89d0-4315-4d98-86bc-570662736bba-scripts\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.429716 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d38c89d0-4315-4d98-86bc-570662736bba-logs\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.762966 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7cc86c585-7kk6w"]
Feb 21 22:03:04 crc kubenswrapper[4717]: E0221 22:03:04.763325 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38c89d0-4315-4d98-86bc-570662736bba" containerName="placement-db-sync"
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.763342 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d38c89d0-4315-4d98-86bc-570662736bba" containerName="placement-db-sync"
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.763534 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d38c89d0-4315-4d98-86bc-570662736bba" containerName="placement-db-sync"
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.764351 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7cc86c585-7kk6w"
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.772305 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.772529 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.782981 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7cc86c585-7kk6w"]
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.798008 4717 generic.go:334] "Generic (PLEG): container finished" podID="ccdfe816-90de-4a26-bd0f-8151a34322c8" containerID="f21d53979e1dc367da643ec51fb16005e66be2c4c14df1e76d83f3ae0b301dcd" exitCode=0
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.798039 4717 generic.go:334] "Generic (PLEG): container finished" podID="ccdfe816-90de-4a26-bd0f-8151a34322c8" containerID="4ebc6437b28c3f7c0d315587afb1d9c4213ee33e77b4215e5a932e6dd77aa790" exitCode=143
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.798078 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ccdfe816-90de-4a26-bd0f-8151a34322c8","Type":"ContainerDied","Data":"f21d53979e1dc367da643ec51fb16005e66be2c4c14df1e76d83f3ae0b301dcd"}
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.798105 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ccdfe816-90de-4a26-bd0f-8151a34322c8","Type":"ContainerDied","Data":"4ebc6437b28c3f7c0d315587afb1d9c4213ee33e77b4215e5a932e6dd77aa790"}
Feb 21 22:03:04 crc kubenswrapper[4717]: I0221
22:03:04.816104 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zsxs4" event={"ID":"d38c89d0-4315-4d98-86bc-570662736bba","Type":"ContainerDied","Data":"977a4e01fcc164859fc4446b4123b574c70a1efb170e5a3ad793cf4a6f1391e4"} Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.816141 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="977a4e01fcc164859fc4446b4123b574c70a1efb170e5a3ad793cf4a6f1391e4" Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.816206 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zsxs4" Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.831031 4717 generic.go:334] "Generic (PLEG): container finished" podID="327110cc-7c97-45f0-9a06-433a0cd67d0d" containerID="fb925d246e20ca659a2607767adc29e3624b48567664a449b10bc0256ab85cf7" exitCode=0 Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.831176 4717 generic.go:334] "Generic (PLEG): container finished" podID="327110cc-7c97-45f0-9a06-433a0cd67d0d" containerID="bab516f922ba492d23ddaa1702d21dd69e30a88a25e38e086d0c0d25b510d562" exitCode=143 Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.831250 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"327110cc-7c97-45f0-9a06-433a0cd67d0d","Type":"ContainerDied","Data":"fb925d246e20ca659a2607767adc29e3624b48567664a449b10bc0256ab85cf7"} Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.831323 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"327110cc-7c97-45f0-9a06-433a0cd67d0d","Type":"ContainerDied","Data":"bab516f922ba492d23ddaa1702d21dd69e30a88a25e38e086d0c0d25b510d562"} Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.938558 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-ovndb-tls-certs\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.938611 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59sbb\" (UniqueName: \"kubernetes.io/projected/acdc8b18-646d-4f3d-8c30-9e80d7b78058-kube-api-access-59sbb\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.938637 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-public-tls-certs\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.938686 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-httpd-config\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.938769 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-combined-ca-bundle\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.938826 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-internal-tls-certs\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.938847 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-config\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.963252 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7f5656cf84-5gfzr"] Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.964981 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.973652 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.973907 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.973944 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.974146 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6476r" Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.981941 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 21 22:03:04 crc kubenswrapper[4717]: I0221 22:03:04.993804 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f5656cf84-5gfzr"] Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 
22:03:05.063294 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-internal-tls-certs\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.063426 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-config\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.063567 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-ovndb-tls-certs\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.063601 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59sbb\" (UniqueName: \"kubernetes.io/projected/acdc8b18-646d-4f3d-8c30-9e80d7b78058-kube-api-access-59sbb\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.063629 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-public-tls-certs\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.063734 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"httpd-config\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-httpd-config\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.063982 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-combined-ca-bundle\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.072940 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-combined-ca-bundle\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.079881 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-httpd-config\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.080616 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-public-tls-certs\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.080630 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-internal-tls-certs\") pod 
\"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.082303 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-config\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.096849 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-ovndb-tls-certs\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.102436 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59sbb\" (UniqueName: \"kubernetes.io/projected/acdc8b18-646d-4f3d-8c30-9e80d7b78058-kube-api-access-59sbb\") pod \"neutron-7cc86c585-7kk6w\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.166008 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-internal-tls-certs\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.166058 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-logs\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " 
pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.166098 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-scripts\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.166115 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8twsp\" (UniqueName: \"kubernetes.io/projected/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-kube-api-access-8twsp\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.166176 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-public-tls-certs\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.166253 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-combined-ca-bundle\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.166277 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-config-data\") pod \"placement-7f5656cf84-5gfzr\" (UID: 
\"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.267738 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-internal-tls-certs\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.268116 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-logs\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.268169 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-scripts\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.268189 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8twsp\" (UniqueName: \"kubernetes.io/projected/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-kube-api-access-8twsp\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.268252 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-public-tls-certs\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 
21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.268325 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-combined-ca-bundle\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.268343 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-config-data\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.268518 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-logs\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.278875 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-public-tls-certs\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.279285 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-scripts\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.280564 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-combined-ca-bundle\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.284421 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-internal-tls-certs\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.288332 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-config-data\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.293340 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8twsp\" (UniqueName: \"kubernetes.io/projected/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-kube-api-access-8twsp\") pod \"placement-7f5656cf84-5gfzr\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.294270 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.389026 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.807359 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f5656cf84-5gfzr"] Feb 21 22:03:05 crc kubenswrapper[4717]: I0221 22:03:05.841948 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f5656cf84-5gfzr" event={"ID":"18ff72f4-66a6-4e32-aac7-7f55e600a1ae","Type":"ContainerStarted","Data":"505b23ed2e99f35dac5726c71b7cd354c86e1f62bbd4cb2b1c125530378edebf"} Feb 21 22:03:06 crc kubenswrapper[4717]: I0221 22:03:06.097412 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7cc86c585-7kk6w"] Feb 21 22:03:06 crc kubenswrapper[4717]: I0221 22:03:06.850049 4717 generic.go:334] "Generic (PLEG): container finished" podID="a647fe81-8c83-4f0c-996b-1a71081700f0" containerID="c8c590bffc5fc340f5d8a8a7611c6d7b82c628ffa8a5d49cfd7f3d19a1152bb0" exitCode=0 Feb 21 22:03:06 crc kubenswrapper[4717]: I0221 22:03:06.850087 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rrpvp" event={"ID":"a647fe81-8c83-4f0c-996b-1a71081700f0","Type":"ContainerDied","Data":"c8c590bffc5fc340f5d8a8a7611c6d7b82c628ffa8a5d49cfd7f3d19a1152bb0"} Feb 21 22:03:08 crc kubenswrapper[4717]: I0221 22:03:08.845742 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:03:08 crc kubenswrapper[4717]: I0221 22:03:08.846075 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:03:08 crc kubenswrapper[4717]: I0221 22:03:08.846088 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:03:08 crc kubenswrapper[4717]: I0221 22:03:08.936708 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mp7ws"] Feb 21 22:03:08 crc 
kubenswrapper[4717]: I0221 22:03:08.937609 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws" podUID="deeb3ad3-4fe8-4faf-9307-5da9988002f6" containerName="dnsmasq-dns" containerID="cri-o://2c9ed442ca6b3107ad9b7938dd2a39e91b15d48bb914f9df805405565a093e25" gracePeriod=10 Feb 21 22:03:09 crc kubenswrapper[4717]: I0221 22:03:09.042946 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:03:09 crc kubenswrapper[4717]: I0221 22:03:09.043315 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:03:09 crc kubenswrapper[4717]: I0221 22:03:09.905614 4717 generic.go:334] "Generic (PLEG): container finished" podID="deeb3ad3-4fe8-4faf-9307-5da9988002f6" containerID="2c9ed442ca6b3107ad9b7938dd2a39e91b15d48bb914f9df805405565a093e25" exitCode=0 Feb 21 22:03:09 crc kubenswrapper[4717]: I0221 22:03:09.905691 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws" event={"ID":"deeb3ad3-4fe8-4faf-9307-5da9988002f6","Type":"ContainerDied","Data":"2c9ed442ca6b3107ad9b7938dd2a39e91b15d48bb914f9df805405565a093e25"} Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.371096 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5b5d497dbf-bkvz6" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.448437 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rrpvp" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.500290 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.542185 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.545651 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.585332 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-combined-ca-bundle\") pod \"a647fe81-8c83-4f0c-996b-1a71081700f0\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.585377 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-credential-keys\") pod \"a647fe81-8c83-4f0c-996b-1a71081700f0\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.585401 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-fernet-keys\") pod \"a647fe81-8c83-4f0c-996b-1a71081700f0\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.585450 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8tm2\" (UniqueName: \"kubernetes.io/projected/a647fe81-8c83-4f0c-996b-1a71081700f0-kube-api-access-j8tm2\") pod \"a647fe81-8c83-4f0c-996b-1a71081700f0\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.585467 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-scripts\") pod \"a647fe81-8c83-4f0c-996b-1a71081700f0\" (UID: 
\"a647fe81-8c83-4f0c-996b-1a71081700f0\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.585571 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-config-data\") pod \"a647fe81-8c83-4f0c-996b-1a71081700f0\" (UID: \"a647fe81-8c83-4f0c-996b-1a71081700f0\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.596953 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a647fe81-8c83-4f0c-996b-1a71081700f0" (UID: "a647fe81-8c83-4f0c-996b-1a71081700f0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.598409 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-scripts" (OuterVolumeSpecName: "scripts") pod "a647fe81-8c83-4f0c-996b-1a71081700f0" (UID: "a647fe81-8c83-4f0c-996b-1a71081700f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.599055 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a647fe81-8c83-4f0c-996b-1a71081700f0-kube-api-access-j8tm2" (OuterVolumeSpecName: "kube-api-access-j8tm2") pod "a647fe81-8c83-4f0c-996b-1a71081700f0" (UID: "a647fe81-8c83-4f0c-996b-1a71081700f0"). InnerVolumeSpecName "kube-api-access-j8tm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.601488 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a647fe81-8c83-4f0c-996b-1a71081700f0" (UID: "a647fe81-8c83-4f0c-996b-1a71081700f0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.645683 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a647fe81-8c83-4f0c-996b-1a71081700f0" (UID: "a647fe81-8c83-4f0c-996b-1a71081700f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.659635 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-config-data" (OuterVolumeSpecName: "config-data") pod "a647fe81-8c83-4f0c-996b-1a71081700f0" (UID: "a647fe81-8c83-4f0c-996b-1a71081700f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.686733 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-dns-swift-storage-0\") pod \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.686766 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-dns-svc\") pod \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.686787 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"327110cc-7c97-45f0-9a06-433a0cd67d0d\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.686806 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-ovsdbserver-nb\") pod \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.686840 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jql8m\" (UniqueName: \"kubernetes.io/projected/deeb3ad3-4fe8-4faf-9307-5da9988002f6-kube-api-access-jql8m\") pod \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.686906 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/ccdfe816-90de-4a26-bd0f-8151a34322c8-httpd-run\") pod \"ccdfe816-90de-4a26-bd0f-8151a34322c8\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.686922 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-ovsdbserver-sb\") pod \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.686941 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-config-data\") pod \"327110cc-7c97-45f0-9a06-433a0cd67d0d\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687011 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-combined-ca-bundle\") pod \"327110cc-7c97-45f0-9a06-433a0cd67d0d\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687040 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-config-data\") pod \"ccdfe816-90de-4a26-bd0f-8151a34322c8\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687058 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ccdfe816-90de-4a26-bd0f-8151a34322c8\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687099 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccdfe816-90de-4a26-bd0f-8151a34322c8-logs\") pod \"ccdfe816-90de-4a26-bd0f-8151a34322c8\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687119 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/327110cc-7c97-45f0-9a06-433a0cd67d0d-httpd-run\") pod \"327110cc-7c97-45f0-9a06-433a0cd67d0d\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687152 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4wps\" (UniqueName: \"kubernetes.io/projected/ccdfe816-90de-4a26-bd0f-8151a34322c8-kube-api-access-l4wps\") pod \"ccdfe816-90de-4a26-bd0f-8151a34322c8\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687167 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/327110cc-7c97-45f0-9a06-433a0cd67d0d-logs\") pod \"327110cc-7c97-45f0-9a06-433a0cd67d0d\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687200 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-scripts\") pod \"ccdfe816-90de-4a26-bd0f-8151a34322c8\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687231 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrqc6\" (UniqueName: \"kubernetes.io/projected/327110cc-7c97-45f0-9a06-433a0cd67d0d-kube-api-access-vrqc6\") pod \"327110cc-7c97-45f0-9a06-433a0cd67d0d\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " Feb 21 
22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687257 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-scripts\") pod \"327110cc-7c97-45f0-9a06-433a0cd67d0d\" (UID: \"327110cc-7c97-45f0-9a06-433a0cd67d0d\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687299 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-combined-ca-bundle\") pod \"ccdfe816-90de-4a26-bd0f-8151a34322c8\" (UID: \"ccdfe816-90de-4a26-bd0f-8151a34322c8\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687318 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-config\") pod \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\" (UID: \"deeb3ad3-4fe8-4faf-9307-5da9988002f6\") " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687628 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8tm2\" (UniqueName: \"kubernetes.io/projected/a647fe81-8c83-4f0c-996b-1a71081700f0-kube-api-access-j8tm2\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687644 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687652 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687661 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687669 4717 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.687677 4717 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a647fe81-8c83-4f0c-996b-1a71081700f0-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.691443 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccdfe816-90de-4a26-bd0f-8151a34322c8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ccdfe816-90de-4a26-bd0f-8151a34322c8" (UID: "ccdfe816-90de-4a26-bd0f-8151a34322c8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.694235 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccdfe816-90de-4a26-bd0f-8151a34322c8-logs" (OuterVolumeSpecName: "logs") pod "ccdfe816-90de-4a26-bd0f-8151a34322c8" (UID: "ccdfe816-90de-4a26-bd0f-8151a34322c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.694463 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/327110cc-7c97-45f0-9a06-433a0cd67d0d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "327110cc-7c97-45f0-9a06-433a0cd67d0d" (UID: "327110cc-7c97-45f0-9a06-433a0cd67d0d"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.697538 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/327110cc-7c97-45f0-9a06-433a0cd67d0d-logs" (OuterVolumeSpecName: "logs") pod "327110cc-7c97-45f0-9a06-433a0cd67d0d" (UID: "327110cc-7c97-45f0-9a06-433a0cd67d0d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.703045 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deeb3ad3-4fe8-4faf-9307-5da9988002f6-kube-api-access-jql8m" (OuterVolumeSpecName: "kube-api-access-jql8m") pod "deeb3ad3-4fe8-4faf-9307-5da9988002f6" (UID: "deeb3ad3-4fe8-4faf-9307-5da9988002f6"). InnerVolumeSpecName "kube-api-access-jql8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.703353 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-scripts" (OuterVolumeSpecName: "scripts") pod "ccdfe816-90de-4a26-bd0f-8151a34322c8" (UID: "ccdfe816-90de-4a26-bd0f-8151a34322c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.706233 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "327110cc-7c97-45f0-9a06-433a0cd67d0d" (UID: "327110cc-7c97-45f0-9a06-433a0cd67d0d"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.706316 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccdfe816-90de-4a26-bd0f-8151a34322c8-kube-api-access-l4wps" (OuterVolumeSpecName: "kube-api-access-l4wps") pod "ccdfe816-90de-4a26-bd0f-8151a34322c8" (UID: "ccdfe816-90de-4a26-bd0f-8151a34322c8"). InnerVolumeSpecName "kube-api-access-l4wps". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.706344 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "ccdfe816-90de-4a26-bd0f-8151a34322c8" (UID: "ccdfe816-90de-4a26-bd0f-8151a34322c8"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.711996 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-scripts" (OuterVolumeSpecName: "scripts") pod "327110cc-7c97-45f0-9a06-433a0cd67d0d" (UID: "327110cc-7c97-45f0-9a06-433a0cd67d0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.717033 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/327110cc-7c97-45f0-9a06-433a0cd67d0d-kube-api-access-vrqc6" (OuterVolumeSpecName: "kube-api-access-vrqc6") pod "327110cc-7c97-45f0-9a06-433a0cd67d0d" (UID: "327110cc-7c97-45f0-9a06-433a0cd67d0d"). InnerVolumeSpecName "kube-api-access-vrqc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.789474 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.789506 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrqc6\" (UniqueName: \"kubernetes.io/projected/327110cc-7c97-45f0-9a06-433a0cd67d0d-kube-api-access-vrqc6\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.789520 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.789549 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.789562 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jql8m\" (UniqueName: \"kubernetes.io/projected/deeb3ad3-4fe8-4faf-9307-5da9988002f6-kube-api-access-jql8m\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.789576 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ccdfe816-90de-4a26-bd0f-8151a34322c8-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.789593 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.789606 4717 reconciler_common.go:293] "Volume detached 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccdfe816-90de-4a26-bd0f-8151a34322c8-logs\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.789617 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/327110cc-7c97-45f0-9a06-433a0cd67d0d-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.789630 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4wps\" (UniqueName: \"kubernetes.io/projected/ccdfe816-90de-4a26-bd0f-8151a34322c8-kube-api-access-l4wps\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.789641 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/327110cc-7c97-45f0-9a06-433a0cd67d0d-logs\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.790039 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-config" (OuterVolumeSpecName: "config") pod "deeb3ad3-4fe8-4faf-9307-5da9988002f6" (UID: "deeb3ad3-4fe8-4faf-9307-5da9988002f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.790235 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccdfe816-90de-4a26-bd0f-8151a34322c8" (UID: "ccdfe816-90de-4a26-bd0f-8151a34322c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.797840 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "deeb3ad3-4fe8-4faf-9307-5da9988002f6" (UID: "deeb3ad3-4fe8-4faf-9307-5da9988002f6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.806043 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "327110cc-7c97-45f0-9a06-433a0cd67d0d" (UID: "327110cc-7c97-45f0-9a06-433a0cd67d0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.818219 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.841692 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-config-data" (OuterVolumeSpecName: "config-data") pod "ccdfe816-90de-4a26-bd0f-8151a34322c8" (UID: "ccdfe816-90de-4a26-bd0f-8151a34322c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.851050 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "deeb3ad3-4fe8-4faf-9307-5da9988002f6" (UID: "deeb3ad3-4fe8-4faf-9307-5da9988002f6"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.851122 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "deeb3ad3-4fe8-4faf-9307-5da9988002f6" (UID: "deeb3ad3-4fe8-4faf-9307-5da9988002f6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.853675 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.855632 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "deeb3ad3-4fe8-4faf-9307-5da9988002f6" (UID: "deeb3ad3-4fe8-4faf-9307-5da9988002f6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.867980 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-config-data" (OuterVolumeSpecName: "config-data") pod "327110cc-7c97-45f0-9a06-433a0cd67d0d" (UID: "327110cc-7c97-45f0-9a06-433a0cd67d0d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.891131 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.891158 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.891167 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327110cc-7c97-45f0-9a06-433a0cd67d0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.891442 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.891459 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.891515 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccdfe816-90de-4a26-bd0f-8151a34322c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.891525 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.891534 4717 reconciler_common.go:293] "Volume detached for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.891546 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.891554 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.891561 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deeb3ad3-4fe8-4faf-9307-5da9988002f6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.923359 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cc86c585-7kk6w" event={"ID":"acdc8b18-646d-4f3d-8c30-9e80d7b78058","Type":"ContainerStarted","Data":"c58eaf12666196ccde2e6a9e1513d9eb0946fea8f0bf4fe7203c1d2335ba8091"} Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.923402 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cc86c585-7kk6w" event={"ID":"acdc8b18-646d-4f3d-8c30-9e80d7b78058","Type":"ContainerStarted","Data":"0be6efccbce84f3ce85b6927315fc3ea7c9d7fb5689f128e98df6fe068617cb7"} Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.923412 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cc86c585-7kk6w" event={"ID":"acdc8b18-646d-4f3d-8c30-9e80d7b78058","Type":"ContainerStarted","Data":"1ab1aa5ab303eff671d3ed9a9f6be1191a457c3cc5eb7df429dab3171096ada4"} Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.924289 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.927414 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ccdfe816-90de-4a26-bd0f-8151a34322c8","Type":"ContainerDied","Data":"3db772312fbe0c82d61b527bb8d9036acd220d3c2ab74ce41d45cdca0231359c"} Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.927473 4717 scope.go:117] "RemoveContainer" containerID="f21d53979e1dc367da643ec51fb16005e66be2c4c14df1e76d83f3ae0b301dcd" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.927626 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.937303 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rrpvp" event={"ID":"a647fe81-8c83-4f0c-996b-1a71081700f0","Type":"ContainerDied","Data":"2c5f2f2ce15b03ec1bfdca97bb9da1c9a7706c167e9a2730baace91d820c5e4c"} Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.937343 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c5f2f2ce15b03ec1bfdca97bb9da1c9a7706c167e9a2730baace91d820c5e4c" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.937398 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rrpvp" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.951634 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7cc86c585-7kk6w" podStartSLOduration=6.951613975 podStartE2EDuration="6.951613975s" podCreationTimestamp="2026-02-21 22:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:10.943797908 +0000 UTC m=+1005.725331530" watchObservedRunningTime="2026-02-21 22:03:10.951613975 +0000 UTC m=+1005.733147597" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.962448 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e","Type":"ContainerStarted","Data":"20357557a8b45a074e6e7deb634846a7988518e3f89e5e29213d62c60b2739bc"} Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.963416 4717 scope.go:117] "RemoveContainer" containerID="4ebc6437b28c3f7c0d315587afb1d9c4213ee33e77b4215e5a932e6dd77aa790" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.965208 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2h82m" event={"ID":"a945001c-fdf1-4bda-8012-3df96d9781ce","Type":"ContainerStarted","Data":"7e68d9608689e85f550352bf2e011ef2171c7d13c1e06959c6078d29af852392"} Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.983045 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"327110cc-7c97-45f0-9a06-433a0cd67d0d","Type":"ContainerDied","Data":"712828d495d6e291fe4b774a466c46d4423983fbdcee665b39e1bb70ffeceeb6"} Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.983126 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 21 22:03:10 crc kubenswrapper[4717]: I0221 22:03:10.989386 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f5656cf84-5gfzr" event={"ID":"18ff72f4-66a6-4e32-aac7-7f55e600a1ae","Type":"ContainerStarted","Data":"d19d54f70cf252efcfebc1a1ea0831b791ae64e739391ed03f4b7ed0b0786009"} Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.002179 4717 scope.go:117] "RemoveContainer" containerID="fb925d246e20ca659a2607767adc29e3624b48567664a449b10bc0256ab85cf7" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.002958 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-2h82m" podStartSLOduration=6.713808526 podStartE2EDuration="41.002940555s" podCreationTimestamp="2026-02-21 22:02:30 +0000 UTC" firstStartedPulling="2026-02-21 22:02:31.735093906 +0000 UTC m=+966.516627528" lastFinishedPulling="2026-02-21 22:03:06.024225935 +0000 UTC m=+1000.805759557" observedRunningTime="2026-02-21 22:03:10.978664123 +0000 UTC m=+1005.760197745" watchObservedRunningTime="2026-02-21 22:03:11.002940555 +0000 UTC m=+1005.784474177" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.010068 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws" event={"ID":"deeb3ad3-4fe8-4faf-9307-5da9988002f6","Type":"ContainerDied","Data":"4165cf23f8e72f6893a95b8590e85b0cde2cac25c2304d1d7e89f9b04e628e85"} Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.010159 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-mp7ws" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.083940 4717 scope.go:117] "RemoveContainer" containerID="bab516f922ba492d23ddaa1702d21dd69e30a88a25e38e086d0c0d25b510d562" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.131391 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.167944 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.189872 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 21 22:03:11 crc kubenswrapper[4717]: E0221 22:03:11.190413 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deeb3ad3-4fe8-4faf-9307-5da9988002f6" containerName="init" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.190476 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="deeb3ad3-4fe8-4faf-9307-5da9988002f6" containerName="init" Feb 21 22:03:11 crc kubenswrapper[4717]: E0221 22:03:11.190533 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deeb3ad3-4fe8-4faf-9307-5da9988002f6" containerName="dnsmasq-dns" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.190586 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="deeb3ad3-4fe8-4faf-9307-5da9988002f6" containerName="dnsmasq-dns" Feb 21 22:03:11 crc kubenswrapper[4717]: E0221 22:03:11.190643 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a647fe81-8c83-4f0c-996b-1a71081700f0" containerName="keystone-bootstrap" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.190696 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a647fe81-8c83-4f0c-996b-1a71081700f0" containerName="keystone-bootstrap" Feb 21 22:03:11 crc kubenswrapper[4717]: E0221 22:03:11.190752 4717 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="327110cc-7c97-45f0-9a06-433a0cd67d0d" containerName="glance-httpd" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.190797 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="327110cc-7c97-45f0-9a06-433a0cd67d0d" containerName="glance-httpd" Feb 21 22:03:11 crc kubenswrapper[4717]: E0221 22:03:11.190852 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccdfe816-90de-4a26-bd0f-8151a34322c8" containerName="glance-log" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.190937 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccdfe816-90de-4a26-bd0f-8151a34322c8" containerName="glance-log" Feb 21 22:03:11 crc kubenswrapper[4717]: E0221 22:03:11.190986 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="327110cc-7c97-45f0-9a06-433a0cd67d0d" containerName="glance-log" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.191031 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="327110cc-7c97-45f0-9a06-433a0cd67d0d" containerName="glance-log" Feb 21 22:03:11 crc kubenswrapper[4717]: E0221 22:03:11.191098 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccdfe816-90de-4a26-bd0f-8151a34322c8" containerName="glance-httpd" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.191146 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccdfe816-90de-4a26-bd0f-8151a34322c8" containerName="glance-httpd" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.191392 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="deeb3ad3-4fe8-4faf-9307-5da9988002f6" containerName="dnsmasq-dns" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.191469 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccdfe816-90de-4a26-bd0f-8151a34322c8" containerName="glance-log" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.191530 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a647fe81-8c83-4f0c-996b-1a71081700f0" containerName="keystone-bootstrap" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.191620 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="327110cc-7c97-45f0-9a06-433a0cd67d0d" containerName="glance-httpd" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.191689 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccdfe816-90de-4a26-bd0f-8151a34322c8" containerName="glance-httpd" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.191743 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="327110cc-7c97-45f0-9a06-433a0cd67d0d" containerName="glance-log" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.192711 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.201991 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.202173 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.202421 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-pp2fz" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.202573 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.221798 4717 scope.go:117] "RemoveContainer" containerID="2c9ed442ca6b3107ad9b7938dd2a39e91b15d48bb914f9df805405565a093e25" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.224376 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.248043 4717 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-default-internal-api-0"] Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.261341 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.288113 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.289943 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.294149 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.295900 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.304440 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.308247 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52df3aba-f914-4946-877c-696b2a29635e-logs\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.308279 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.308323 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.308362 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-config-data\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.308418 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.308526 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52df3aba-f914-4946-877c-696b2a29635e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.308589 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-scripts\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.308732 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q667j\" (UniqueName: \"kubernetes.io/projected/52df3aba-f914-4946-877c-696b2a29635e-kube-api-access-q667j\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.338757 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mp7ws"] Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.348230 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mp7ws"] Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.410871 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.410933 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q667j\" (UniqueName: \"kubernetes.io/projected/52df3aba-f914-4946-877c-696b2a29635e-kube-api-access-q667j\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.410968 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.410985 4717 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52df3aba-f914-4946-877c-696b2a29635e-logs\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.411011 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.411042 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.411067 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.411093 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-config-data\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.411116 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6pwm\" 
(UniqueName: \"kubernetes.io/projected/d64df97f-f950-4a75-b5a1-7497f752a5cb-kube-api-access-l6pwm\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.411145 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.411161 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52df3aba-f914-4946-877c-696b2a29635e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.411177 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.411197 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-scripts\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.411217 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.411233 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d64df97f-f950-4a75-b5a1-7497f752a5cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.411254 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d64df97f-f950-4a75-b5a1-7497f752a5cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.411966 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.412215 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52df3aba-f914-4946-877c-696b2a29635e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.412352 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/52df3aba-f914-4946-877c-696b2a29635e-logs\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.419052 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-scripts\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.419854 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.422492 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.423474 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-config-data\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.434484 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q667j\" (UniqueName: \"kubernetes.io/projected/52df3aba-f914-4946-877c-696b2a29635e-kube-api-access-q667j\") pod 
\"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.453320 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.512280 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.512371 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.512415 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6pwm\" (UniqueName: \"kubernetes.io/projected/d64df97f-f950-4a75-b5a1-7497f752a5cb-kube-api-access-l6pwm\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.512458 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " 
pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.512491 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.512509 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d64df97f-f950-4a75-b5a1-7497f752a5cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.512526 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d64df97f-f950-4a75-b5a1-7497f752a5cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.512569 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.513117 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d64df97f-f950-4a75-b5a1-7497f752a5cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc 
kubenswrapper[4717]: I0221 22:03:11.513373 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.513793 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d64df97f-f950-4a75-b5a1-7497f752a5cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.519734 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.528119 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.528269 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.528843 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.529067 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.531690 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6pwm\" (UniqueName: \"kubernetes.io/projected/d64df97f-f950-4a75-b5a1-7497f752a5cb-kube-api-access-l6pwm\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.551088 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.621344 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.647524 4717 scope.go:117] "RemoveContainer" containerID="65e9cfd12ba025bf858dc4e8fb25c4b474b986d65007e857520136d028b3244c" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.680713 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-86d46bb596-pj8cr"] Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.683799 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.697810 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.698021 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.698075 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lfvb4" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.698396 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.699087 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.709260 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.735246 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-86d46bb596-pj8cr"] Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.817467 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-internal-tls-certs\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.817778 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-scripts\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.817935 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-public-tls-certs\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.818049 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-fernet-keys\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.818197 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-combined-ca-bundle\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.818324 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz4h7\" (UniqueName: 
\"kubernetes.io/projected/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-kube-api-access-mz4h7\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.818429 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-credential-keys\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.818528 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-config-data\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.919818 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-combined-ca-bundle\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.919881 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz4h7\" (UniqueName: \"kubernetes.io/projected/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-kube-api-access-mz4h7\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.919902 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-config-data\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.919927 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-credential-keys\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.920029 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-internal-tls-certs\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.920055 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-scripts\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.920075 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-public-tls-certs\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.920090 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-fernet-keys\") pod 
\"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.925352 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-credential-keys\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.926317 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-public-tls-certs\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.926702 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-scripts\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.927217 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-fernet-keys\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.927570 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-config-data\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc 
kubenswrapper[4717]: I0221 22:03:11.927775 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-internal-tls-certs\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.935804 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-combined-ca-bundle\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.956563 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz4h7\" (UniqueName: \"kubernetes.io/projected/110e5c1e-4f14-4ab1-a0e0-f54dec9095a2-kube-api-access-mz4h7\") pod \"keystone-86d46bb596-pj8cr\" (UID: \"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2\") " pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.997920 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="327110cc-7c97-45f0-9a06-433a0cd67d0d" path="/var/lib/kubelet/pods/327110cc-7c97-45f0-9a06-433a0cd67d0d/volumes" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.999027 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccdfe816-90de-4a26-bd0f-8151a34322c8" path="/var/lib/kubelet/pods/ccdfe816-90de-4a26-bd0f-8151a34322c8/volumes" Feb 21 22:03:11 crc kubenswrapper[4717]: I0221 22:03:11.999827 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deeb3ad3-4fe8-4faf-9307-5da9988002f6" path="/var/lib/kubelet/pods/deeb3ad3-4fe8-4faf-9307-5da9988002f6/volumes" Feb 21 22:03:12 crc kubenswrapper[4717]: I0221 22:03:12.001370 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:12 crc kubenswrapper[4717]: I0221 22:03:12.068004 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f5656cf84-5gfzr" event={"ID":"18ff72f4-66a6-4e32-aac7-7f55e600a1ae","Type":"ContainerStarted","Data":"5c4e8c0e66ea1a088792a17483cd986369847244b124d1158e29623ee690c911"} Feb 21 22:03:12 crc kubenswrapper[4717]: I0221 22:03:12.068568 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:12 crc kubenswrapper[4717]: I0221 22:03:12.068607 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:12 crc kubenswrapper[4717]: I0221 22:03:12.092814 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7f5656cf84-5gfzr" podStartSLOduration=8.092799772 podStartE2EDuration="8.092799772s" podCreationTimestamp="2026-02-21 22:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:12.08729296 +0000 UTC m=+1006.868826582" watchObservedRunningTime="2026-02-21 22:03:12.092799772 +0000 UTC m=+1006.874333394" Feb 21 22:03:12 crc kubenswrapper[4717]: I0221 22:03:12.659285 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-86d46bb596-pj8cr"] Feb 21 22:03:12 crc kubenswrapper[4717]: I0221 22:03:12.809962 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 21 22:03:13 crc kubenswrapper[4717]: I0221 22:03:13.099810 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-86d46bb596-pj8cr" event={"ID":"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2","Type":"ContainerStarted","Data":"7e6886823d81283671c5e5e4896708a91b903556784d2b6041534cef5009aff5"} Feb 21 22:03:13 crc kubenswrapper[4717]: I0221 22:03:13.101874 
4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52df3aba-f914-4946-877c-696b2a29635e","Type":"ContainerStarted","Data":"592ebc94154f46d3064064dae5eb1a8f5e6aba38c2cd9cf663ee1a660face980"} Feb 21 22:03:13 crc kubenswrapper[4717]: I0221 22:03:13.446986 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 21 22:03:14 crc kubenswrapper[4717]: I0221 22:03:14.114535 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-86d46bb596-pj8cr" event={"ID":"110e5c1e-4f14-4ab1-a0e0-f54dec9095a2","Type":"ContainerStarted","Data":"750d6995f0d5745e140666a7314e3f793a8f9aace306262935fc868ff168be8d"} Feb 21 22:03:14 crc kubenswrapper[4717]: I0221 22:03:14.114754 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-86d46bb596-pj8cr" Feb 21 22:03:14 crc kubenswrapper[4717]: I0221 22:03:14.119771 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52df3aba-f914-4946-877c-696b2a29635e","Type":"ContainerStarted","Data":"860db4a3cce90dd8b9140766be130786d23775f01cb01b8ebb1444ca9e310a10"} Feb 21 22:03:14 crc kubenswrapper[4717]: I0221 22:03:14.122169 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6xk4j" event={"ID":"3727ff36-57dd-4c91-ab08-d5c87ee4e357","Type":"ContainerStarted","Data":"5a4944f557f86e62b86f768738d5868dbe6540a06594c3111ea7858880ba2707"} Feb 21 22:03:14 crc kubenswrapper[4717]: I0221 22:03:14.123950 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d64df97f-f950-4a75-b5a1-7497f752a5cb","Type":"ContainerStarted","Data":"9e174530a07701efaa6fcf6bbd7666e44ffe3e062cbc6b7b4d3c1e6d6a8dec57"} Feb 21 22:03:14 crc kubenswrapper[4717]: I0221 22:03:14.143632 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-86d46bb596-pj8cr" podStartSLOduration=3.143615627 podStartE2EDuration="3.143615627s" podCreationTimestamp="2026-02-21 22:03:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:14.133986187 +0000 UTC m=+1008.915519809" watchObservedRunningTime="2026-02-21 22:03:14.143615627 +0000 UTC m=+1008.925149249" Feb 21 22:03:14 crc kubenswrapper[4717]: I0221 22:03:14.158825 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-6xk4j" podStartSLOduration=2.572788169 podStartE2EDuration="44.158809971s" podCreationTimestamp="2026-02-21 22:02:30 +0000 UTC" firstStartedPulling="2026-02-21 22:02:31.197338649 +0000 UTC m=+965.978872271" lastFinishedPulling="2026-02-21 22:03:12.783360451 +0000 UTC m=+1007.564894073" observedRunningTime="2026-02-21 22:03:14.149058238 +0000 UTC m=+1008.930591860" watchObservedRunningTime="2026-02-21 22:03:14.158809971 +0000 UTC m=+1008.940343593" Feb 21 22:03:15 crc kubenswrapper[4717]: I0221 22:03:15.158548 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52df3aba-f914-4946-877c-696b2a29635e","Type":"ContainerStarted","Data":"1e689ee64fcdd8877957d289ad22bbeee9f9bc12f3a4a3b3b4e6968b03d20826"} Feb 21 22:03:15 crc kubenswrapper[4717]: I0221 22:03:15.162096 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d64df97f-f950-4a75-b5a1-7497f752a5cb","Type":"ContainerStarted","Data":"80342756a29c71937eb0da4308d5d408a8c98b432d6e960fdf7ca357bccf2a1e"} Feb 21 22:03:15 crc kubenswrapper[4717]: I0221 22:03:15.162141 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d64df97f-f950-4a75-b5a1-7497f752a5cb","Type":"ContainerStarted","Data":"6bbcbd8899019791389b55bb5fd8b7d6e32917b828da2b7fef7c7f8b7c22a22c"} 
Feb 21 22:03:15 crc kubenswrapper[4717]: I0221 22:03:15.166972 4717 generic.go:334] "Generic (PLEG): container finished" podID="a945001c-fdf1-4bda-8012-3df96d9781ce" containerID="7e68d9608689e85f550352bf2e011ef2171c7d13c1e06959c6078d29af852392" exitCode=0 Feb 21 22:03:15 crc kubenswrapper[4717]: I0221 22:03:15.167104 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2h82m" event={"ID":"a945001c-fdf1-4bda-8012-3df96d9781ce","Type":"ContainerDied","Data":"7e68d9608689e85f550352bf2e011ef2171c7d13c1e06959c6078d29af852392"} Feb 21 22:03:15 crc kubenswrapper[4717]: I0221 22:03:15.177366 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.17735342 podStartE2EDuration="4.17735342s" podCreationTimestamp="2026-02-21 22:03:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:15.174820509 +0000 UTC m=+1009.956354131" watchObservedRunningTime="2026-02-21 22:03:15.17735342 +0000 UTC m=+1009.958887032" Feb 21 22:03:15 crc kubenswrapper[4717]: I0221 22:03:15.236591 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.236573349 podStartE2EDuration="4.236573349s" podCreationTimestamp="2026-02-21 22:03:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:15.224271114 +0000 UTC m=+1010.005804736" watchObservedRunningTime="2026-02-21 22:03:15.236573349 +0000 UTC m=+1010.018106971" Feb 21 22:03:17 crc kubenswrapper[4717]: I0221 22:03:17.190320 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7cc86c585-7kk6w_acdc8b18-646d-4f3d-8c30-9e80d7b78058/neutron-api/0.log" Feb 21 22:03:17 crc kubenswrapper[4717]: I0221 22:03:17.190736 4717 
generic.go:334] "Generic (PLEG): container finished" podID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerID="0be6efccbce84f3ce85b6927315fc3ea7c9d7fb5689f128e98df6fe068617cb7" exitCode=1 Feb 21 22:03:17 crc kubenswrapper[4717]: I0221 22:03:17.190765 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cc86c585-7kk6w" event={"ID":"acdc8b18-646d-4f3d-8c30-9e80d7b78058","Type":"ContainerDied","Data":"0be6efccbce84f3ce85b6927315fc3ea7c9d7fb5689f128e98df6fe068617cb7"} Feb 21 22:03:17 crc kubenswrapper[4717]: I0221 22:03:17.191325 4717 scope.go:117] "RemoveContainer" containerID="0be6efccbce84f3ce85b6927315fc3ea7c9d7fb5689f128e98df6fe068617cb7" Feb 21 22:03:17 crc kubenswrapper[4717]: I0221 22:03:17.204077 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7cc86c585-7kk6w" podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 21 22:03:18 crc kubenswrapper[4717]: I0221 22:03:18.199019 4717 generic.go:334] "Generic (PLEG): container finished" podID="3727ff36-57dd-4c91-ab08-d5c87ee4e357" containerID="5a4944f557f86e62b86f768738d5868dbe6540a06594c3111ea7858880ba2707" exitCode=0 Feb 21 22:03:18 crc kubenswrapper[4717]: I0221 22:03:18.199375 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6xk4j" event={"ID":"3727ff36-57dd-4c91-ab08-d5c87ee4e357","Type":"ContainerDied","Data":"5a4944f557f86e62b86f768738d5868dbe6540a06594c3111ea7858880ba2707"} Feb 21 22:03:18 crc kubenswrapper[4717]: I0221 22:03:18.847028 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-669df94976-tmfpb" podUID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Feb 21 22:03:19 crc kubenswrapper[4717]: I0221 22:03:19.044780 4717 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-86b8dffbf6-mztpd" podUID="4e230efb-55a4-4e7f-9d9a-cc61d3123eab" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.141911 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6xk4j" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.163016 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2h82m" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.229640 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2h82m" event={"ID":"a945001c-fdf1-4bda-8012-3df96d9781ce","Type":"ContainerDied","Data":"49750635f2e50d64d2b499f091f7be35db00c3f12d0617cddc3484c1144cd48c"} Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.229679 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49750635f2e50d64d2b499f091f7be35db00c3f12d0617cddc3484c1144cd48c" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.229741 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-2h82m" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.241990 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6xk4j" event={"ID":"3727ff36-57dd-4c91-ab08-d5c87ee4e357","Type":"ContainerDied","Data":"f4effad02a5e29ffefe59a3b8e1ac69d5ebe492203fa2ec0b1e88da0db58a205"} Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.242029 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4effad02a5e29ffefe59a3b8e1ac69d5ebe492203fa2ec0b1e88da0db58a205" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.242089 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6xk4j" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.294765 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7jlg\" (UniqueName: \"kubernetes.io/projected/3727ff36-57dd-4c91-ab08-d5c87ee4e357-kube-api-access-f7jlg\") pod \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.294839 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-db-sync-config-data\") pod \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.294893 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2n6z\" (UniqueName: \"kubernetes.io/projected/a945001c-fdf1-4bda-8012-3df96d9781ce-kube-api-access-s2n6z\") pod \"a945001c-fdf1-4bda-8012-3df96d9781ce\" (UID: \"a945001c-fdf1-4bda-8012-3df96d9781ce\") " Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.295003 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-scripts\") pod \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.295842 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-config-data\") pod \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.296007 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a945001c-fdf1-4bda-8012-3df96d9781ce-db-sync-config-data\") pod \"a945001c-fdf1-4bda-8012-3df96d9781ce\" (UID: \"a945001c-fdf1-4bda-8012-3df96d9781ce\") " Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.296597 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-combined-ca-bundle\") pod \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.296709 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3727ff36-57dd-4c91-ab08-d5c87ee4e357-etc-machine-id\") pod \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\" (UID: \"3727ff36-57dd-4c91-ab08-d5c87ee4e357\") " Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.296740 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a945001c-fdf1-4bda-8012-3df96d9781ce-combined-ca-bundle\") pod \"a945001c-fdf1-4bda-8012-3df96d9781ce\" (UID: \"a945001c-fdf1-4bda-8012-3df96d9781ce\") " Feb 21 22:03:20 crc 
kubenswrapper[4717]: I0221 22:03:20.296988 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3727ff36-57dd-4c91-ab08-d5c87ee4e357-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3727ff36-57dd-4c91-ab08-d5c87ee4e357" (UID: "3727ff36-57dd-4c91-ab08-d5c87ee4e357"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.297479 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3727ff36-57dd-4c91-ab08-d5c87ee4e357-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.301164 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a945001c-fdf1-4bda-8012-3df96d9781ce-kube-api-access-s2n6z" (OuterVolumeSpecName: "kube-api-access-s2n6z") pod "a945001c-fdf1-4bda-8012-3df96d9781ce" (UID: "a945001c-fdf1-4bda-8012-3df96d9781ce"). InnerVolumeSpecName "kube-api-access-s2n6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.301271 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-scripts" (OuterVolumeSpecName: "scripts") pod "3727ff36-57dd-4c91-ab08-d5c87ee4e357" (UID: "3727ff36-57dd-4c91-ab08-d5c87ee4e357"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.302019 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3727ff36-57dd-4c91-ab08-d5c87ee4e357" (UID: "3727ff36-57dd-4c91-ab08-d5c87ee4e357"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.302296 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a945001c-fdf1-4bda-8012-3df96d9781ce-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a945001c-fdf1-4bda-8012-3df96d9781ce" (UID: "a945001c-fdf1-4bda-8012-3df96d9781ce"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.302675 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3727ff36-57dd-4c91-ab08-d5c87ee4e357-kube-api-access-f7jlg" (OuterVolumeSpecName: "kube-api-access-f7jlg") pod "3727ff36-57dd-4c91-ab08-d5c87ee4e357" (UID: "3727ff36-57dd-4c91-ab08-d5c87ee4e357"). InnerVolumeSpecName "kube-api-access-f7jlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.326480 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a945001c-fdf1-4bda-8012-3df96d9781ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a945001c-fdf1-4bda-8012-3df96d9781ce" (UID: "a945001c-fdf1-4bda-8012-3df96d9781ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.332520 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3727ff36-57dd-4c91-ab08-d5c87ee4e357" (UID: "3727ff36-57dd-4c91-ab08-d5c87ee4e357"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.343215 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-config-data" (OuterVolumeSpecName: "config-data") pod "3727ff36-57dd-4c91-ab08-d5c87ee4e357" (UID: "3727ff36-57dd-4c91-ab08-d5c87ee4e357"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.399989 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.400022 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a945001c-fdf1-4bda-8012-3df96d9781ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.400032 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7jlg\" (UniqueName: \"kubernetes.io/projected/3727ff36-57dd-4c91-ab08-d5c87ee4e357-kube-api-access-f7jlg\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.400044 4717 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.400054 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2n6z\" (UniqueName: \"kubernetes.io/projected/a945001c-fdf1-4bda-8012-3df96d9781ce-kube-api-access-s2n6z\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.400063 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.400071 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3727ff36-57dd-4c91-ab08-d5c87ee4e357-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.400081 4717 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a945001c-fdf1-4bda-8012-3df96d9781ce-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.482775 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 21 22:03:20 crc kubenswrapper[4717]: E0221 22:03:20.483791 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3727ff36-57dd-4c91-ab08-d5c87ee4e357" containerName="cinder-db-sync" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.483812 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3727ff36-57dd-4c91-ab08-d5c87ee4e357" containerName="cinder-db-sync" Feb 21 22:03:20 crc kubenswrapper[4717]: E0221 22:03:20.483847 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a945001c-fdf1-4bda-8012-3df96d9781ce" containerName="barbican-db-sync" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.483854 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a945001c-fdf1-4bda-8012-3df96d9781ce" containerName="barbican-db-sync" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.484034 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3727ff36-57dd-4c91-ab08-d5c87ee4e357" containerName="cinder-db-sync" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.484052 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a945001c-fdf1-4bda-8012-3df96d9781ce" 
containerName="barbican-db-sync" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.485166 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.492036 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.493791 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.500591 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.500656 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.500696 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-scripts\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.500721 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.500755 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48ccp\" (UniqueName: \"kubernetes.io/projected/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-kube-api-access-48ccp\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.500772 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.521553 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-pk26h"] Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.523002 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.552991 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-pk26h"] Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.607718 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-config\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.607807 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-dns-svc\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.607880 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.607978 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdx5m\" (UniqueName: \"kubernetes.io/projected/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-kube-api-access-tdx5m\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.608064 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.608202 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.608313 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-dns-swift-storage-0\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.608338 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-scripts\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.608371 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-config-data\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.608398 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-ovsdbserver-nb\") pod 
\"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.608536 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48ccp\" (UniqueName: \"kubernetes.io/projected/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-kube-api-access-48ccp\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.608579 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.608621 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-ovsdbserver-sb\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.613790 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.615573 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " 
pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.615691 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-config-data\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.616181 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-scripts\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.640394 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48ccp\" (UniqueName: \"kubernetes.io/projected/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-kube-api-access-48ccp\") pod \"cinder-scheduler-0\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.710690 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-dns-swift-storage-0\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.710751 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-ovsdbserver-nb\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.710792 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-ovsdbserver-sb\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.710825 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-config\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.710850 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-dns-svc\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.710901 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdx5m\" (UniqueName: \"kubernetes.io/projected/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-kube-api-access-tdx5m\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.711689 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-ovsdbserver-nb\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.711713 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-dns-swift-storage-0\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.711883 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-ovsdbserver-sb\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.712041 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-dns-svc\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.712097 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-config\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.728583 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdx5m\" (UniqueName: \"kubernetes.io/projected/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-kube-api-access-tdx5m\") pod \"dnsmasq-dns-d68b9cb4c-pk26h\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.789921 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.791843 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.805552 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.805759 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.811741 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-config-data\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.811787 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5rgh\" (UniqueName: \"kubernetes.io/projected/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-kube-api-access-x5rgh\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.811819 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.811912 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.811953 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-logs\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.812001 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-scripts\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.812038 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-config-data-custom\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.819570 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.848475 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.915034 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5rgh\" (UniqueName: \"kubernetes.io/projected/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-kube-api-access-x5rgh\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.915115 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.915160 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.915202 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-logs\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.915227 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-scripts\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.915267 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-config-data-custom\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.915347 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-config-data\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.918314 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-logs\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.919162 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.925790 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-scripts\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.926031 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-config-data\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.927263 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-config-data-custom\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.933536 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:20 crc kubenswrapper[4717]: I0221 22:03:20.947496 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5rgh\" (UniqueName: \"kubernetes.io/projected/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-kube-api-access-x5rgh\") pod \"cinder-api-0\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " pod="openstack/cinder-api-0" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.108459 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.500177 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6fbd665b5-2sdwf"] Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.502396 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.504668 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-mpvs9" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.505244 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.505382 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.531086 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.531117 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.536603 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lgkr\" (UniqueName: \"kubernetes.io/projected/30134acb-a272-4da8-a2b6-683e431f593e-kube-api-access-8lgkr\") pod \"barbican-worker-6fbd665b5-2sdwf\" (UID: \"30134acb-a272-4da8-a2b6-683e431f593e\") " pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.536636 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30134acb-a272-4da8-a2b6-683e431f593e-logs\") pod \"barbican-worker-6fbd665b5-2sdwf\" (UID: \"30134acb-a272-4da8-a2b6-683e431f593e\") " pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.536658 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/30134acb-a272-4da8-a2b6-683e431f593e-config-data-custom\") pod \"barbican-worker-6fbd665b5-2sdwf\" (UID: \"30134acb-a272-4da8-a2b6-683e431f593e\") " pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.536755 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30134acb-a272-4da8-a2b6-683e431f593e-config-data\") pod \"barbican-worker-6fbd665b5-2sdwf\" (UID: \"30134acb-a272-4da8-a2b6-683e431f593e\") " pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.536781 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30134acb-a272-4da8-a2b6-683e431f593e-combined-ca-bundle\") pod \"barbican-worker-6fbd665b5-2sdwf\" (UID: \"30134acb-a272-4da8-a2b6-683e431f593e\") " pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.553451 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6fbd665b5-2sdwf"] Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.577136 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6bbdbc7546-gvgtj"] Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.591006 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.598819 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.606483 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6bbdbc7546-gvgtj"] Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.632160 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.632195 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.638213 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lgkr\" (UniqueName: \"kubernetes.io/projected/30134acb-a272-4da8-a2b6-683e431f593e-kube-api-access-8lgkr\") pod \"barbican-worker-6fbd665b5-2sdwf\" (UID: \"30134acb-a272-4da8-a2b6-683e431f593e\") " pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.638380 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30134acb-a272-4da8-a2b6-683e431f593e-logs\") pod \"barbican-worker-6fbd665b5-2sdwf\" (UID: \"30134acb-a272-4da8-a2b6-683e431f593e\") " pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.638464 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30134acb-a272-4da8-a2b6-683e431f593e-config-data-custom\") pod \"barbican-worker-6fbd665b5-2sdwf\" (UID: \"30134acb-a272-4da8-a2b6-683e431f593e\") " 
pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.638641 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30134acb-a272-4da8-a2b6-683e431f593e-config-data\") pod \"barbican-worker-6fbd665b5-2sdwf\" (UID: \"30134acb-a272-4da8-a2b6-683e431f593e\") " pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.638729 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30134acb-a272-4da8-a2b6-683e431f593e-combined-ca-bundle\") pod \"barbican-worker-6fbd665b5-2sdwf\" (UID: \"30134acb-a272-4da8-a2b6-683e431f593e\") " pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.639768 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30134acb-a272-4da8-a2b6-683e431f593e-logs\") pod \"barbican-worker-6fbd665b5-2sdwf\" (UID: \"30134acb-a272-4da8-a2b6-683e431f593e\") " pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.711701 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-pk26h"] Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.716222 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.717827 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30134acb-a272-4da8-a2b6-683e431f593e-config-data\") pod \"barbican-worker-6fbd665b5-2sdwf\" (UID: \"30134acb-a272-4da8-a2b6-683e431f593e\") " pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.718438 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30134acb-a272-4da8-a2b6-683e431f593e-config-data-custom\") pod \"barbican-worker-6fbd665b5-2sdwf\" (UID: \"30134acb-a272-4da8-a2b6-683e431f593e\") " pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.718988 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30134acb-a272-4da8-a2b6-683e431f593e-combined-ca-bundle\") pod \"barbican-worker-6fbd665b5-2sdwf\" (UID: \"30134acb-a272-4da8-a2b6-683e431f593e\") " pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.727723 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lgkr\" (UniqueName: \"kubernetes.io/projected/30134acb-a272-4da8-a2b6-683e431f593e-kube-api-access-8lgkr\") pod \"barbican-worker-6fbd665b5-2sdwf\" (UID: \"30134acb-a272-4da8-a2b6-683e431f593e\") " pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.746268 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71cac7a0-f790-43e5-87f9-d3862c20f857-config-data-custom\") pod \"barbican-keystone-listener-6bbdbc7546-gvgtj\" (UID: \"71cac7a0-f790-43e5-87f9-d3862c20f857\") " pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.746411 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x244\" (UniqueName: \"kubernetes.io/projected/71cac7a0-f790-43e5-87f9-d3862c20f857-kube-api-access-9x244\") pod \"barbican-keystone-listener-6bbdbc7546-gvgtj\" (UID: \"71cac7a0-f790-43e5-87f9-d3862c20f857\") " 
pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.746442 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cac7a0-f790-43e5-87f9-d3862c20f857-combined-ca-bundle\") pod \"barbican-keystone-listener-6bbdbc7546-gvgtj\" (UID: \"71cac7a0-f790-43e5-87f9-d3862c20f857\") " pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.746468 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cac7a0-f790-43e5-87f9-d3862c20f857-logs\") pod \"barbican-keystone-listener-6bbdbc7546-gvgtj\" (UID: \"71cac7a0-f790-43e5-87f9-d3862c20f857\") " pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.746552 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71cac7a0-f790-43e5-87f9-d3862c20f857-config-data\") pod \"barbican-keystone-listener-6bbdbc7546-gvgtj\" (UID: \"71cac7a0-f790-43e5-87f9-d3862c20f857\") " pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.759320 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-gsz4r"] Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.766029 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.784815 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-gsz4r"] Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.793762 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-578d8fcbf6-jg7l4"] Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.795306 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.797516 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.806360 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-578d8fcbf6-jg7l4"] Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.812264 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.849594 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x244\" (UniqueName: \"kubernetes.io/projected/71cac7a0-f790-43e5-87f9-d3862c20f857-kube-api-access-9x244\") pod \"barbican-keystone-listener-6bbdbc7546-gvgtj\" (UID: \"71cac7a0-f790-43e5-87f9-d3862c20f857\") " pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.849627 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cac7a0-f790-43e5-87f9-d3862c20f857-combined-ca-bundle\") pod \"barbican-keystone-listener-6bbdbc7546-gvgtj\" (UID: \"71cac7a0-f790-43e5-87f9-d3862c20f857\") " pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: 
I0221 22:03:21.849650 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cac7a0-f790-43e5-87f9-d3862c20f857-logs\") pod \"barbican-keystone-listener-6bbdbc7546-gvgtj\" (UID: \"71cac7a0-f790-43e5-87f9-d3862c20f857\") " pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.849675 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmqpr\" (UniqueName: \"kubernetes.io/projected/33f84b4a-a654-4453-b1cf-c18ae33fd406-kube-api-access-vmqpr\") pod \"barbican-api-578d8fcbf6-jg7l4\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") " pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.849705 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-combined-ca-bundle\") pod \"barbican-api-578d8fcbf6-jg7l4\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") " pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.849744 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-config-data-custom\") pod \"barbican-api-578d8fcbf6-jg7l4\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") " pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.849770 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71cac7a0-f790-43e5-87f9-d3862c20f857-config-data\") pod \"barbican-keystone-listener-6bbdbc7546-gvgtj\" (UID: \"71cac7a0-f790-43e5-87f9-d3862c20f857\") " 
pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.849798 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-dns-svc\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.849824 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71cac7a0-f790-43e5-87f9-d3862c20f857-config-data-custom\") pod \"barbican-keystone-listener-6bbdbc7546-gvgtj\" (UID: \"71cac7a0-f790-43e5-87f9-d3862c20f857\") " pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.849842 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.849884 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.849919 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmcz8\" (UniqueName: \"kubernetes.io/projected/43f65bb5-3e40-46d5-81ef-433c345544ac-kube-api-access-tmcz8\") pod 
\"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.849935 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33f84b4a-a654-4453-b1cf-c18ae33fd406-logs\") pod \"barbican-api-578d8fcbf6-jg7l4\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") " pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.849955 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-config-data\") pod \"barbican-api-578d8fcbf6-jg7l4\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") " pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.849981 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.850001 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-config\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.851171 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71cac7a0-f790-43e5-87f9-d3862c20f857-logs\") pod \"barbican-keystone-listener-6bbdbc7546-gvgtj\" 
(UID: \"71cac7a0-f790-43e5-87f9-d3862c20f857\") " pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.867140 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71cac7a0-f790-43e5-87f9-d3862c20f857-config-data-custom\") pod \"barbican-keystone-listener-6bbdbc7546-gvgtj\" (UID: \"71cac7a0-f790-43e5-87f9-d3862c20f857\") " pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.873237 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71cac7a0-f790-43e5-87f9-d3862c20f857-config-data\") pod \"barbican-keystone-listener-6bbdbc7546-gvgtj\" (UID: \"71cac7a0-f790-43e5-87f9-d3862c20f857\") " pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.878048 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cac7a0-f790-43e5-87f9-d3862c20f857-combined-ca-bundle\") pod \"barbican-keystone-listener-6bbdbc7546-gvgtj\" (UID: \"71cac7a0-f790-43e5-87f9-d3862c20f857\") " pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.880801 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x244\" (UniqueName: \"kubernetes.io/projected/71cac7a0-f790-43e5-87f9-d3862c20f857-kube-api-access-9x244\") pod \"barbican-keystone-listener-6bbdbc7546-gvgtj\" (UID: \"71cac7a0-f790-43e5-87f9-d3862c20f857\") " pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.893259 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6fbd665b5-2sdwf" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.941065 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.943574 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-pk26h"] Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.949440 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.951589 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.951624 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-config\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.951666 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmqpr\" (UniqueName: \"kubernetes.io/projected/33f84b4a-a654-4453-b1cf-c18ae33fd406-kube-api-access-vmqpr\") pod \"barbican-api-578d8fcbf6-jg7l4\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") " pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.951692 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-combined-ca-bundle\") pod \"barbican-api-578d8fcbf6-jg7l4\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") " pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.951732 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-config-data-custom\") pod \"barbican-api-578d8fcbf6-jg7l4\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") " pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.951777 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-dns-svc\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.951800 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.951825 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.951872 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmcz8\" (UniqueName: 
\"kubernetes.io/projected/43f65bb5-3e40-46d5-81ef-433c345544ac-kube-api-access-tmcz8\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.951887 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33f84b4a-a654-4453-b1cf-c18ae33fd406-logs\") pod \"barbican-api-578d8fcbf6-jg7l4\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") " pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.951906 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-config-data\") pod \"barbican-api-578d8fcbf6-jg7l4\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") " pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.953992 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-config\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.961881 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.962465 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.962476 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-dns-svc\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.962802 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-config-data-custom\") pod \"barbican-api-578d8fcbf6-jg7l4\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") " pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.962823 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.963066 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33f84b4a-a654-4453-b1cf-c18ae33fd406-logs\") pod \"barbican-api-578d8fcbf6-jg7l4\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") " pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.970631 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-combined-ca-bundle\") pod \"barbican-api-578d8fcbf6-jg7l4\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") " 
pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.977932 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" Feb 21 22:03:21 crc kubenswrapper[4717]: I0221 22:03:21.991252 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-config-data\") pod \"barbican-api-578d8fcbf6-jg7l4\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") " pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.004490 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmqpr\" (UniqueName: \"kubernetes.io/projected/33f84b4a-a654-4453-b1cf-c18ae33fd406-kube-api-access-vmqpr\") pod \"barbican-api-578d8fcbf6-jg7l4\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") " pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.018768 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.079792 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmcz8\" (UniqueName: \"kubernetes.io/projected/43f65bb5-3e40-46d5-81ef-433c345544ac-kube-api-access-tmcz8\") pod \"dnsmasq-dns-5784cf869f-gsz4r\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.107660 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.139617 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.213007 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.350002 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" event={"ID":"45f49dad-ff8b-43fc-ad24-bf3fae05a74f","Type":"ContainerStarted","Data":"51a553031680273e69235a9604d56f0f09eba1913a1e9fd45dc2067090ac8b7a"} Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.417928 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7cc86c585-7kk6w_acdc8b18-646d-4f3d-8c30-9e80d7b78058/neutron-api/0.log" Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.418028 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cc86c585-7kk6w" event={"ID":"acdc8b18-646d-4f3d-8c30-9e80d7b78058","Type":"ContainerStarted","Data":"36102092b9c64661dccc8a40947d3bd38f49cb5b7cee6da295a6d3cf1e75fc7b"} Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.477970 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5","Type":"ContainerStarted","Data":"d25b7a585c8c8b91beef51025c0825678f4d8622e43e8766484d2a37c6321c38"} Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.487243 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7cc86c585-7kk6w" podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.498576 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b","Type":"ContainerStarted","Data":"3e55860e3ba9b287f1b49e50cc81cc20d9e77ca9c7aa32875c34058a210132ce"} Feb 21 22:03:22 crc 
kubenswrapper[4717]: I0221 22:03:22.498629 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.499029 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.499047 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.499157 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.547594 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6fbd665b5-2sdwf"] Feb 21 22:03:22 crc kubenswrapper[4717]: E0221 22:03:22.673002 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.790253 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-gsz4r"] Feb 21 22:03:22 crc kubenswrapper[4717]: I0221 22:03:22.917951 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6bbdbc7546-gvgtj"] Feb 21 22:03:23 crc kubenswrapper[4717]: I0221 22:03:23.013507 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 21 22:03:23 crc kubenswrapper[4717]: I0221 22:03:23.027922 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-578d8fcbf6-jg7l4"] Feb 21 22:03:23 crc kubenswrapper[4717]: I0221 22:03:23.558422 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="45f49dad-ff8b-43fc-ad24-bf3fae05a74f" containerID="8868d1cd7b687472f93ffcdd5c97c28116748460741a1c5dcbef0210cd03fe7a" exitCode=0 Feb 21 22:03:23 crc kubenswrapper[4717]: I0221 22:03:23.558741 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" event={"ID":"45f49dad-ff8b-43fc-ad24-bf3fae05a74f","Type":"ContainerDied","Data":"8868d1cd7b687472f93ffcdd5c97c28116748460741a1c5dcbef0210cd03fe7a"} Feb 21 22:03:23 crc kubenswrapper[4717]: I0221 22:03:23.568541 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578d8fcbf6-jg7l4" event={"ID":"33f84b4a-a654-4453-b1cf-c18ae33fd406","Type":"ContainerStarted","Data":"3ce4190a09bac64fd14bbeeb70c39e8a4fb40dfaae0db1d8e649c3b638f962aa"} Feb 21 22:03:23 crc kubenswrapper[4717]: I0221 22:03:23.613670 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6fbd665b5-2sdwf" event={"ID":"30134acb-a272-4da8-a2b6-683e431f593e","Type":"ContainerStarted","Data":"e0bf820e3ffa1cd3a21dbfd364cb1dde82efc30e8376d5b86e4e2404623ff0a3"} Feb 21 22:03:23 crc kubenswrapper[4717]: I0221 22:03:23.623759 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5","Type":"ContainerStarted","Data":"6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa"} Feb 21 22:03:23 crc kubenswrapper[4717]: I0221 22:03:23.626385 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" event={"ID":"43f65bb5-3e40-46d5-81ef-433c345544ac","Type":"ContainerStarted","Data":"99a14157e5a62a7cb0b6c5a97e94f36ee6453d662f89287772fa3c2161b3cdbd"} Feb 21 22:03:23 crc kubenswrapper[4717]: I0221 22:03:23.632373 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" 
event={"ID":"71cac7a0-f790-43e5-87f9-d3862c20f857","Type":"ContainerStarted","Data":"c967507bf2a10b3114ecdd3aa9f53d69a53ea78267b9fad582d219fc140f1222"} Feb 21 22:03:23 crc kubenswrapper[4717]: I0221 22:03:23.652994 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e","Type":"ContainerStarted","Data":"d2343c28cc082082d1f24088ce657308a8b330cbf4e30a0fea5e7e4f45b38c36"} Feb 21 22:03:23 crc kubenswrapper[4717]: I0221 22:03:23.653192 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" containerName="ceilometer-notification-agent" containerID="cri-o://7e423e308e430f4b5eb77d84609197685709348b1f5cd9bb210c457c78f89114" gracePeriod=30 Feb 21 22:03:23 crc kubenswrapper[4717]: I0221 22:03:23.653272 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 21 22:03:23 crc kubenswrapper[4717]: I0221 22:03:23.653645 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" containerName="proxy-httpd" containerID="cri-o://d2343c28cc082082d1f24088ce657308a8b330cbf4e30a0fea5e7e4f45b38c36" gracePeriod=30 Feb 21 22:03:23 crc kubenswrapper[4717]: I0221 22:03:23.653690 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" containerName="sg-core" containerID="cri-o://20357557a8b45a074e6e7deb634846a7988518e3f89e5e29213d62c60b2739bc" gracePeriod=30 Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.022449 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.155421 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-ovsdbserver-nb\") pod \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.155758 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-config\") pod \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.155803 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-dns-swift-storage-0\") pod \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.155832 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdx5m\" (UniqueName: \"kubernetes.io/projected/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-kube-api-access-tdx5m\") pod \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.156051 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-dns-svc\") pod \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.156081 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-ovsdbserver-sb\") pod \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\" (UID: \"45f49dad-ff8b-43fc-ad24-bf3fae05a74f\") " Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.188245 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-kube-api-access-tdx5m" (OuterVolumeSpecName: "kube-api-access-tdx5m") pod "45f49dad-ff8b-43fc-ad24-bf3fae05a74f" (UID: "45f49dad-ff8b-43fc-ad24-bf3fae05a74f"). InnerVolumeSpecName "kube-api-access-tdx5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.210501 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "45f49dad-ff8b-43fc-ad24-bf3fae05a74f" (UID: "45f49dad-ff8b-43fc-ad24-bf3fae05a74f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.227066 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45f49dad-ff8b-43fc-ad24-bf3fae05a74f" (UID: "45f49dad-ff8b-43fc-ad24-bf3fae05a74f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.228053 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-config" (OuterVolumeSpecName: "config") pod "45f49dad-ff8b-43fc-ad24-bf3fae05a74f" (UID: "45f49dad-ff8b-43fc-ad24-bf3fae05a74f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.236224 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45f49dad-ff8b-43fc-ad24-bf3fae05a74f" (UID: "45f49dad-ff8b-43fc-ad24-bf3fae05a74f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.267389 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.267426 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.267539 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.267564 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.267578 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdx5m\" (UniqueName: \"kubernetes.io/projected/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-kube-api-access-tdx5m\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.318881 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45f49dad-ff8b-43fc-ad24-bf3fae05a74f" (UID: "45f49dad-ff8b-43fc-ad24-bf3fae05a74f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.371201 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45f49dad-ff8b-43fc-ad24-bf3fae05a74f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.720027 4717 generic.go:334] "Generic (PLEG): container finished" podID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" containerID="d2343c28cc082082d1f24088ce657308a8b330cbf4e30a0fea5e7e4f45b38c36" exitCode=0 Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.720277 4717 generic.go:334] "Generic (PLEG): container finished" podID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" containerID="20357557a8b45a074e6e7deb634846a7988518e3f89e5e29213d62c60b2739bc" exitCode=2 Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.720122 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e","Type":"ContainerDied","Data":"d2343c28cc082082d1f24088ce657308a8b330cbf4e30a0fea5e7e4f45b38c36"} Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.720356 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e","Type":"ContainerDied","Data":"20357557a8b45a074e6e7deb634846a7988518e3f89e5e29213d62c60b2739bc"} Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.722216 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" event={"ID":"45f49dad-ff8b-43fc-ad24-bf3fae05a74f","Type":"ContainerDied","Data":"51a553031680273e69235a9604d56f0f09eba1913a1e9fd45dc2067090ac8b7a"} Feb 21 
22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.722260 4717 scope.go:117] "RemoveContainer" containerID="8868d1cd7b687472f93ffcdd5c97c28116748460741a1c5dcbef0210cd03fe7a" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.722437 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-pk26h" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.741699 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578d8fcbf6-jg7l4" event={"ID":"33f84b4a-a654-4453-b1cf-c18ae33fd406","Type":"ContainerStarted","Data":"ae73ebaaa6364e4b0a644a8f646615e548e39ddfa298cf644a3ae033aec93f63"} Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.741746 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578d8fcbf6-jg7l4" event={"ID":"33f84b4a-a654-4453-b1cf-c18ae33fd406","Type":"ContainerStarted","Data":"3dc2fd4e57614c4ce0d3977a3abc762fa5a3541a7d6ee321dc08ad77aed8e631"} Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.743005 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.743031 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.745419 4717 generic.go:334] "Generic (PLEG): container finished" podID="43f65bb5-3e40-46d5-81ef-433c345544ac" containerID="f00fb47b8119e27f62a5f7ddb63182aa1b6cd697608ec90c6df25f93bd8cbbfd" exitCode=0 Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.745459 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" event={"ID":"43f65bb5-3e40-46d5-81ef-433c345544ac","Type":"ContainerDied","Data":"f00fb47b8119e27f62a5f7ddb63182aa1b6cd697608ec90c6df25f93bd8cbbfd"} Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.762896 4717 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-578d8fcbf6-jg7l4" podStartSLOduration=3.762848295 podStartE2EDuration="3.762848295s" podCreationTimestamp="2026-02-21 22:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:24.761771089 +0000 UTC m=+1019.543304711" watchObservedRunningTime="2026-02-21 22:03:24.762848295 +0000 UTC m=+1019.544381917" Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.822927 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-pk26h"] Feb 21 22:03:24 crc kubenswrapper[4717]: I0221 22:03:24.832906 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-pk26h"] Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.462706 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.463041 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.467731 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.467795 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.469893 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.561694 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.766065 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5","Type":"ContainerStarted","Data":"a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3"} Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.766212 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" containerName="cinder-api-log" containerID="cri-o://6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa" gracePeriod=30 Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.766499 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.766728 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" containerName="cinder-api" containerID="cri-o://a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3" gracePeriod=30 Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.778612 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" event={"ID":"43f65bb5-3e40-46d5-81ef-433c345544ac","Type":"ContainerStarted","Data":"d955b1b3aafbd0d0b01af76c2163bd00378a9a0ba0fb36bf775af1818075a4cb"} Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.779447 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.783690 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b","Type":"ContainerStarted","Data":"8a034c2b39ce2ff13c6f05c5bb37b400cbed75ccf99d3f32374508ab4aaf231a"} Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.783731 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b","Type":"ContainerStarted","Data":"1941e3ed1896ccd75f5724e696cdcba44836764c82a487e4a23ca3fb61898ab0"} Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.785553 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.785540763 podStartE2EDuration="5.785540763s" podCreationTimestamp="2026-02-21 22:03:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:25.781886066 +0000 UTC m=+1020.563419688" watchObservedRunningTime="2026-02-21 22:03:25.785540763 +0000 UTC m=+1020.567074385" Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.806004 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.813717 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" podStartSLOduration=4.813695347 podStartE2EDuration="4.813695347s" podCreationTimestamp="2026-02-21 22:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:25.802365886 +0000 UTC m=+1020.583899508" watchObservedRunningTime="2026-02-21 22:03:25.813695347 +0000 UTC m=+1020.595228969" Feb 21 22:03:25 crc kubenswrapper[4717]: I0221 22:03:25.823454 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.486999716 podStartE2EDuration="5.823436111s" podCreationTimestamp="2026-02-21 22:03:20 +0000 UTC" firstStartedPulling="2026-02-21 22:03:22.063630712 +0000 UTC m=+1016.845164334" lastFinishedPulling="2026-02-21 22:03:23.400067107 +0000 UTC m=+1018.181600729" observedRunningTime="2026-02-21 22:03:25.819742913 +0000 UTC m=+1020.601276535" 
watchObservedRunningTime="2026-02-21 22:03:25.823436111 +0000 UTC m=+1020.604969733" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.011697 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f49dad-ff8b-43fc-ad24-bf3fae05a74f" path="/var/lib/kubelet/pods/45f49dad-ff8b-43fc-ad24-bf3fae05a74f/volumes" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.809702 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.833515 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6fbd665b5-2sdwf" event={"ID":"30134acb-a272-4da8-a2b6-683e431f593e","Type":"ContainerStarted","Data":"c3284d5774f4cca0956e45ca230d81dc9ee32bc33c4bb5dcd2e860e352b2b2ef"} Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.836813 4717 generic.go:334] "Generic (PLEG): container finished" podID="a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" containerID="a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3" exitCode=0 Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.836845 4717 generic.go:334] "Generic (PLEG): container finished" podID="a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" containerID="6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa" exitCode=143 Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.836896 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.836915 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5","Type":"ContainerDied","Data":"a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3"} Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.836942 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5","Type":"ContainerDied","Data":"6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa"} Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.836953 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5","Type":"ContainerDied","Data":"d25b7a585c8c8b91beef51025c0825678f4d8622e43e8766484d2a37c6321c38"} Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.836967 4717 scope.go:117] "RemoveContainer" containerID="a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.852551 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" event={"ID":"71cac7a0-f790-43e5-87f9-d3862c20f857","Type":"ContainerStarted","Data":"bbf6657734923dfdd2c42e9a7680db7334a554bbdef515890ab49303b8e8348d"} Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.866816 4717 scope.go:117] "RemoveContainer" containerID="6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.922358 4717 scope.go:117] "RemoveContainer" containerID="a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3" Feb 21 22:03:26 crc kubenswrapper[4717]: E0221 22:03:26.923310 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3\": container with ID starting with a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3 not found: ID does not exist" containerID="a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.923346 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3"} err="failed to get container status \"a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3\": rpc error: code = NotFound desc = could not find container \"a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3\": container with ID starting with a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3 not found: ID does not exist" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.923372 4717 scope.go:117] "RemoveContainer" containerID="6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa" Feb 21 22:03:26 crc kubenswrapper[4717]: E0221 22:03:26.923803 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa\": container with ID starting with 6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa not found: ID does not exist" containerID="6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.923845 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa"} err="failed to get container status \"6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa\": rpc error: code = NotFound desc = could not find container \"6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa\": container with ID 
starting with 6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa not found: ID does not exist" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.923880 4717 scope.go:117] "RemoveContainer" containerID="a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.924464 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3"} err="failed to get container status \"a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3\": rpc error: code = NotFound desc = could not find container \"a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3\": container with ID starting with a4db18e790aebb548c2d0bb8291a61646e9e5fb13dc6af1bb6bc2533da30e0c3 not found: ID does not exist" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.924486 4717 scope.go:117] "RemoveContainer" containerID="6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.925160 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa"} err="failed to get container status \"6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa\": rpc error: code = NotFound desc = could not find container \"6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa\": container with ID starting with 6733ff55f30712fd4379cbbe442c3dc145ef0ef9f237a3f0e6f282eb2f7517fa not found: ID does not exist" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.931992 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-config-data\") pod \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " 
Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.932134 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-config-data-custom\") pod \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.932169 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-combined-ca-bundle\") pod \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.932224 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-logs\") pod \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.932248 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5rgh\" (UniqueName: \"kubernetes.io/projected/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-kube-api-access-x5rgh\") pod \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.932327 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-scripts\") pod \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.932811 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-logs" (OuterVolumeSpecName: "logs") pod 
"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" (UID: "a29d9a59-2788-4eed-b6c9-c1a8648bd5e5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.932986 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-etc-machine-id\") pod \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\" (UID: \"a29d9a59-2788-4eed-b6c9-c1a8648bd5e5\") " Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.933502 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-logs\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.934016 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" (UID: "a29d9a59-2788-4eed-b6c9-c1a8648bd5e5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.939190 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-kube-api-access-x5rgh" (OuterVolumeSpecName: "kube-api-access-x5rgh") pod "a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" (UID: "a29d9a59-2788-4eed-b6c9-c1a8648bd5e5"). InnerVolumeSpecName "kube-api-access-x5rgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.960473 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-scripts" (OuterVolumeSpecName: "scripts") pod "a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" (UID: "a29d9a59-2788-4eed-b6c9-c1a8648bd5e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.960597 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" (UID: "a29d9a59-2788-4eed-b6c9-c1a8648bd5e5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:26 crc kubenswrapper[4717]: I0221 22:03:26.997050 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" (UID: "a29d9a59-2788-4eed-b6c9-c1a8648bd5e5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.035111 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.035143 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.035154 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.035163 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.035172 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5rgh\" (UniqueName: \"kubernetes.io/projected/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-kube-api-access-x5rgh\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.046487 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-config-data" (OuterVolumeSpecName: "config-data") pod "a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" (UID: "a29d9a59-2788-4eed-b6c9-c1a8648bd5e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.136973 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.173877 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.194781 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-df9cd9b6-x9br7"] Feb 21 22:03:27 crc kubenswrapper[4717]: E0221 22:03:27.195238 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" containerName="cinder-api-log" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.195260 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" containerName="cinder-api-log" Feb 21 22:03:27 crc kubenswrapper[4717]: E0221 22:03:27.195288 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" containerName="cinder-api" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.195295 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" containerName="cinder-api" Feb 21 22:03:27 crc kubenswrapper[4717]: E0221 22:03:27.195309 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f49dad-ff8b-43fc-ad24-bf3fae05a74f" containerName="init" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.195319 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f49dad-ff8b-43fc-ad24-bf3fae05a74f" containerName="init" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.195533 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" containerName="cinder-api" Feb 21 22:03:27 crc 
kubenswrapper[4717]: I0221 22:03:27.195555 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" containerName="cinder-api-log" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.195570 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f49dad-ff8b-43fc-ad24-bf3fae05a74f" containerName="init" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.196501 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.200219 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.201412 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.208296 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.222723 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-df9cd9b6-x9br7"] Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.268710 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.270071 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.276192 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.276410 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.277256 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.324993 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365050 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb8dh\" (UniqueName: \"kubernetes.io/projected/db0133ac-ab76-4b9d-a3e9-e55a095f919a-kube-api-access-wb8dh\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365092 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365111 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-scripts\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365131 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-internal-tls-certs\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365152 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-config-data\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365166 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db0133ac-ab76-4b9d-a3e9-e55a095f919a-logs\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365191 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-config-data-custom\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365216 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365240 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ts7s\" (UniqueName: 
\"kubernetes.io/projected/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-kube-api-access-8ts7s\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365260 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-combined-ca-bundle\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365282 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365299 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-config-data-custom\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365316 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-logs\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365362 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-public-tls-certs\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365379 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-config-data\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.365406 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db0133ac-ab76-4b9d-a3e9-e55a095f919a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.466556 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.466620 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ts7s\" (UniqueName: \"kubernetes.io/projected/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-kube-api-access-8ts7s\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.466648 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-combined-ca-bundle\") pod 
\"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.466677 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.466692 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-config-data-custom\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.466713 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-logs\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.466765 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-public-tls-certs\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.466788 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-config-data\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc 
kubenswrapper[4717]: I0221 22:03:27.466818 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db0133ac-ab76-4b9d-a3e9-e55a095f919a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.466869 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb8dh\" (UniqueName: \"kubernetes.io/projected/db0133ac-ab76-4b9d-a3e9-e55a095f919a-kube-api-access-wb8dh\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.466886 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.466906 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-scripts\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.466924 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-internal-tls-certs\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.466947 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-config-data\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.466961 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db0133ac-ab76-4b9d-a3e9-e55a095f919a-logs\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.466991 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-config-data-custom\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.468105 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db0133ac-ab76-4b9d-a3e9-e55a095f919a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.472914 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.473166 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-config-data-custom\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: 
I0221 22:03:27.473295 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db0133ac-ab76-4b9d-a3e9-e55a095f919a-logs\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.473368 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-logs\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.473536 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.474548 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-config-data\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.479317 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-internal-tls-certs\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.481233 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-combined-ca-bundle\") pod 
\"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.481412 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.482171 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-public-tls-certs\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.483296 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb8dh\" (UniqueName: \"kubernetes.io/projected/db0133ac-ab76-4b9d-a3e9-e55a095f919a-kube-api-access-wb8dh\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.484477 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db0133ac-ab76-4b9d-a3e9-e55a095f919a-scripts\") pod \"cinder-api-0\" (UID: \"db0133ac-ab76-4b9d-a3e9-e55a095f919a\") " pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.484679 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-config-data\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.485359 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-config-data-custom\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.505565 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ts7s\" (UniqueName: \"kubernetes.io/projected/ace10c13-1b1b-4e3a-8f58-b8dab0c80704-kube-api-access-8ts7s\") pod \"barbican-api-df9cd9b6-x9br7\" (UID: \"ace10c13-1b1b-4e3a-8f58-b8dab0c80704\") " pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.509515 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.594146 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.867782 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6fbd665b5-2sdwf" event={"ID":"30134acb-a272-4da8-a2b6-683e431f593e","Type":"ContainerStarted","Data":"ff5fdccee20def5c4079556ed5710ae3cd0082203e1b814af91d731241f21d06"} Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.879905 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" event={"ID":"71cac7a0-f790-43e5-87f9-d3862c20f857","Type":"ContainerStarted","Data":"8e8b082dffbf71d875f32a4b0009b339435ecdd87c29ca24c626fd844dcecf9a"} Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.886365 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6fbd665b5-2sdwf" podStartSLOduration=3.005117408 podStartE2EDuration="6.886350927s" podCreationTimestamp="2026-02-21 22:03:21 +0000 UTC" firstStartedPulling="2026-02-21 22:03:22.578267985 +0000 UTC m=+1017.359801597" lastFinishedPulling="2026-02-21 22:03:26.459501494 +0000 UTC m=+1021.241035116" observedRunningTime="2026-02-21 22:03:27.884564853 +0000 UTC m=+1022.666098475" watchObservedRunningTime="2026-02-21 22:03:27.886350927 +0000 UTC m=+1022.667884549" Feb 21 22:03:27 crc kubenswrapper[4717]: I0221 22:03:27.924114 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6bbdbc7546-gvgtj" podStartSLOduration=3.859955723 podStartE2EDuration="6.924095411s" podCreationTimestamp="2026-02-21 22:03:21 +0000 UTC" firstStartedPulling="2026-02-21 22:03:23.390451698 +0000 UTC m=+1018.171985330" lastFinishedPulling="2026-02-21 22:03:26.454591396 +0000 UTC m=+1021.236125018" observedRunningTime="2026-02-21 22:03:27.913331213 +0000 UTC m=+1022.694864835" watchObservedRunningTime="2026-02-21 22:03:27.924095411 +0000 UTC m=+1022.705629033" Feb 21 22:03:28 crc 
kubenswrapper[4717]: I0221 22:03:28.004626 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a29d9a59-2788-4eed-b6c9-c1a8648bd5e5" path="/var/lib/kubelet/pods/a29d9a59-2788-4eed-b6c9-c1a8648bd5e5/volumes" Feb 21 22:03:28 crc kubenswrapper[4717]: I0221 22:03:28.127681 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-df9cd9b6-x9br7"] Feb 21 22:03:28 crc kubenswrapper[4717]: I0221 22:03:28.278785 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 21 22:03:28 crc kubenswrapper[4717]: I0221 22:03:28.912256 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-df9cd9b6-x9br7" event={"ID":"ace10c13-1b1b-4e3a-8f58-b8dab0c80704","Type":"ContainerStarted","Data":"f40bde37c1f16e0c466b3fedf07a3b55a9bad97bbcdd7d8b1c56b7c9862eb012"} Feb 21 22:03:28 crc kubenswrapper[4717]: I0221 22:03:28.912655 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-df9cd9b6-x9br7" event={"ID":"ace10c13-1b1b-4e3a-8f58-b8dab0c80704","Type":"ContainerStarted","Data":"dedb52242fd343ccfa2acc035bdc6957c988f26cd18dc3e6569db61b525d8ce1"} Feb 21 22:03:28 crc kubenswrapper[4717]: I0221 22:03:28.914435 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:03:28 crc kubenswrapper[4717]: I0221 22:03:28.917337 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"db0133ac-ab76-4b9d-a3e9-e55a095f919a","Type":"ContainerStarted","Data":"37f2f764af34163736e7d5e0c5be3c19c42addacf7aed0ad8d446e33d1fe0837"} Feb 21 22:03:28 crc kubenswrapper[4717]: I0221 22:03:28.919222 4717 generic.go:334] "Generic (PLEG): container finished" podID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" containerID="7e423e308e430f4b5eb77d84609197685709348b1f5cd9bb210c457c78f89114" exitCode=0 Feb 21 22:03:28 crc kubenswrapper[4717]: I0221 22:03:28.919904 4717 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e","Type":"ContainerDied","Data":"7e423e308e430f4b5eb77d84609197685709348b1f5cd9bb210c457c78f89114"} Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.263248 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7cc86c585-7kk6w"] Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.265547 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7cc86c585-7kk6w" podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerName="neutron-httpd" containerID="cri-o://c58eaf12666196ccde2e6a9e1513d9eb0946fea8f0bf4fe7203c1d2335ba8091" gracePeriod=30 Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.266442 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7cc86c585-7kk6w" podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerName="neutron-api" containerID="cri-o://36102092b9c64661dccc8a40947d3bd38f49cb5b7cee6da295a6d3cf1e75fc7b" gracePeriod=30 Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.292896 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7cc86c585-7kk6w" podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": read tcp 10.217.0.2:39022->10.217.0.156:9696: read: connection reset by peer" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.310339 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-677cdf8c9f-j2vl7"] Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.312643 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.321914 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-httpd-config\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.321958 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-ovndb-tls-certs\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.321986 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-combined-ca-bundle\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.322005 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-internal-tls-certs\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.322023 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-public-tls-certs\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: 
\"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.322039 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bxwr\" (UniqueName: \"kubernetes.io/projected/5aad4222-11d4-4bb0-9a90-b9924339c70e-kube-api-access-5bxwr\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.322075 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-config\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.327492 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-677cdf8c9f-j2vl7"] Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.423398 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-httpd-config\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.423443 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-ovndb-tls-certs\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.423471 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-combined-ca-bundle\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.423488 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-internal-tls-certs\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.423505 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-public-tls-certs\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.423519 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bxwr\" (UniqueName: \"kubernetes.io/projected/5aad4222-11d4-4bb0-9a90-b9924339c70e-kube-api-access-5bxwr\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.423560 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-config\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.431001 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-combined-ca-bundle\") pod 
\"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.435694 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-config\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.436623 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-ovndb-tls-certs\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.436655 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-public-tls-certs\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.437204 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-httpd-config\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.437404 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aad4222-11d4-4bb0-9a90-b9924339c70e-internal-tls-certs\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc 
kubenswrapper[4717]: I0221 22:03:29.447229 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bxwr\" (UniqueName: \"kubernetes.io/projected/5aad4222-11d4-4bb0-9a90-b9924339c70e-kube-api-access-5bxwr\") pod \"neutron-677cdf8c9f-j2vl7\" (UID: \"5aad4222-11d4-4bb0-9a90-b9924339c70e\") " pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.548027 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.555397 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.626960 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-log-httpd\") pod \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.627068 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq4cc\" (UniqueName: \"kubernetes.io/projected/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-kube-api-access-kq4cc\") pod \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.627122 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-sg-core-conf-yaml\") pod \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.627146 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-config-data\") pod \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.627256 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-combined-ca-bundle\") pod \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.627332 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-scripts\") pod \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.627356 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-run-httpd\") pod \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\" (UID: \"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e\") " Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.627782 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" (UID: "7ff6e9c3-c94b-43ec-bba0-6e180be99f9e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.628290 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.628554 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" (UID: "7ff6e9c3-c94b-43ec-bba0-6e180be99f9e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.633277 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-kube-api-access-kq4cc" (OuterVolumeSpecName: "kube-api-access-kq4cc") pod "7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" (UID: "7ff6e9c3-c94b-43ec-bba0-6e180be99f9e"). InnerVolumeSpecName "kube-api-access-kq4cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.656724 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-scripts" (OuterVolumeSpecName: "scripts") pod "7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" (UID: "7ff6e9c3-c94b-43ec-bba0-6e180be99f9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.702414 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" (UID: "7ff6e9c3-c94b-43ec-bba0-6e180be99f9e"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.723081 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" (UID: "7ff6e9c3-c94b-43ec-bba0-6e180be99f9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.728902 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.728920 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.728930 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq4cc\" (UniqueName: \"kubernetes.io/projected/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-kube-api-access-kq4cc\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.728939 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.728947 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.818492 4717 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-config-data" (OuterVolumeSpecName: "config-data") pod "7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" (UID: "7ff6e9c3-c94b-43ec-bba0-6e180be99f9e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.829972 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.946372 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"db0133ac-ab76-4b9d-a3e9-e55a095f919a","Type":"ContainerStarted","Data":"fc0864195725c51ca2177bb5ed2cc19e5a2a726d945a93bbe0d9848576c88bd0"} Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.950356 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ff6e9c3-c94b-43ec-bba0-6e180be99f9e","Type":"ContainerDied","Data":"310da457da96fdb6e65cb1262e97f3ee1bbc39cc77fb5e92d5eca3e4d2723b81"} Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.950465 4717 scope.go:117] "RemoveContainer" containerID="d2343c28cc082082d1f24088ce657308a8b330cbf4e30a0fea5e7e4f45b38c36" Feb 21 22:03:29 crc kubenswrapper[4717]: I0221 22:03:29.950374 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.024141 4717 scope.go:117] "RemoveContainer" containerID="20357557a8b45a074e6e7deb634846a7988518e3f89e5e29213d62c60b2739bc" Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.024330 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.041014 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.048056 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7cc86c585-7kk6w_acdc8b18-646d-4f3d-8c30-9e80d7b78058/neutron-api/0.log" Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.048123 4717 generic.go:334] "Generic (PLEG): container finished" podID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerID="c58eaf12666196ccde2e6a9e1513d9eb0946fea8f0bf4fe7203c1d2335ba8091" exitCode=0 Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.048241 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cc86c585-7kk6w" event={"ID":"acdc8b18-646d-4f3d-8c30-9e80d7b78058","Type":"ContainerDied","Data":"c58eaf12666196ccde2e6a9e1513d9eb0946fea8f0bf4fe7203c1d2335ba8091"} Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.065319 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-df9cd9b6-x9br7" event={"ID":"ace10c13-1b1b-4e3a-8f58-b8dab0c80704","Type":"ContainerStarted","Data":"463cfa8140b72c9d9d0406251227d17bb0c30891d459aeb30d98c5fdbb0af50e"} Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.065980 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.066032 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-df9cd9b6-x9br7" Feb 21 22:03:30 crc 
kubenswrapper[4717]: I0221 22:03:30.078137 4717 scope.go:117] "RemoveContainer" containerID="7e423e308e430f4b5eb77d84609197685709348b1f5cd9bb210c457c78f89114"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.082922 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 21 22:03:30 crc kubenswrapper[4717]: E0221 22:03:30.083637 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" containerName="proxy-httpd"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.083654 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" containerName="proxy-httpd"
Feb 21 22:03:30 crc kubenswrapper[4717]: E0221 22:03:30.083675 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" containerName="ceilometer-notification-agent"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.083682 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" containerName="ceilometer-notification-agent"
Feb 21 22:03:30 crc kubenswrapper[4717]: E0221 22:03:30.083714 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" containerName="sg-core"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.083721 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" containerName="sg-core"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.083912 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" containerName="sg-core"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.083935 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" containerName="ceilometer-notification-agent"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.083949 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" containerName="proxy-httpd"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.085621 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.088189 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.091814 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.096298 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.114142 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-df9cd9b6-x9br7" podStartSLOduration=3.114124373 podStartE2EDuration="3.114124373s" podCreationTimestamp="2026-02-21 22:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:30.10652803 +0000 UTC m=+1024.888061652" watchObservedRunningTime="2026-02-21 22:03:30.114124373 +0000 UTC m=+1024.895657995"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.166669 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-677cdf8c9f-j2vl7"]
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.242161 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwnwt\" (UniqueName: \"kubernetes.io/projected/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-kube-api-access-xwnwt\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.242292 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-log-httpd\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.242349 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.242389 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-scripts\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.242465 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-config-data\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.242501 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-run-httpd\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.242552 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.345523 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-scripts\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.345600 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-config-data\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.345623 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-run-httpd\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.345659 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.345724 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwnwt\" (UniqueName: \"kubernetes.io/projected/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-kube-api-access-xwnwt\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.345767 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-log-httpd\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.345797 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.346162 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-run-httpd\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.346472 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-log-httpd\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.348954 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-scripts\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.351503 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.369951 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.370244 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-config-data\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.375833 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwnwt\" (UniqueName: \"kubernetes.io/projected/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-kube-api-access-xwnwt\") pod \"ceilometer-0\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") " pod="openstack/ceilometer-0"
Feb 21 22:03:30 crc kubenswrapper[4717]: I0221 22:03:30.428376 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.089208 4717 generic.go:334] "Generic (PLEG): container finished" podID="439eab0e-0489-4a97-993e-c6c3df03e694" containerID="86c54231ad6bf38342e808fd0a76f35e200e9ceaff8aa1c46db4a352090a3b06" exitCode=137
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.089445 4717 generic.go:334] "Generic (PLEG): container finished" podID="439eab0e-0489-4a97-993e-c6c3df03e694" containerID="8543a25dbf077d192eea9219229f62802d037cf29feaa2444a1507cb3cc6e9e7" exitCode=137
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.089523 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ff8b8ffdf-4dwxk" event={"ID":"439eab0e-0489-4a97-993e-c6c3df03e694","Type":"ContainerDied","Data":"86c54231ad6bf38342e808fd0a76f35e200e9ceaff8aa1c46db4a352090a3b06"}
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.089549 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ff8b8ffdf-4dwxk" event={"ID":"439eab0e-0489-4a97-993e-c6c3df03e694","Type":"ContainerDied","Data":"8543a25dbf077d192eea9219229f62802d037cf29feaa2444a1507cb3cc6e9e7"}
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.094068 4717 generic.go:334] "Generic (PLEG): container finished" podID="09f86cea-494d-4f11-b9f5-2045f7aabd92" containerID="e150b7b66dcbe638973e20a725f834dd974991b8313f7237c6399a518093e9e7" exitCode=137
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.094097 4717 generic.go:334] "Generic (PLEG): container finished" podID="09f86cea-494d-4f11-b9f5-2045f7aabd92" containerID="208618fb7dba17474af74671a8910f21495f0c58490ed4c913725f2d1bc0de38" exitCode=137
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.094136 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b5d497dbf-bkvz6" event={"ID":"09f86cea-494d-4f11-b9f5-2045f7aabd92","Type":"ContainerDied","Data":"e150b7b66dcbe638973e20a725f834dd974991b8313f7237c6399a518093e9e7"}
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.094161 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b5d497dbf-bkvz6" event={"ID":"09f86cea-494d-4f11-b9f5-2045f7aabd92","Type":"ContainerDied","Data":"208618fb7dba17474af74671a8910f21495f0c58490ed4c913725f2d1bc0de38"}
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.101235 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-677cdf8c9f-j2vl7" event={"ID":"5aad4222-11d4-4bb0-9a90-b9924339c70e","Type":"ContainerStarted","Data":"6279aed0707cdedd9c08998a2750fd7dbab08400062cff5e9a450111febb8591"}
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.101262 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-677cdf8c9f-j2vl7" event={"ID":"5aad4222-11d4-4bb0-9a90-b9924339c70e","Type":"ContainerStarted","Data":"9337c00dd2094ade49c1287680f9b53d7c59edafb44efb4b58911d54791cbba4"}
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.101272 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-677cdf8c9f-j2vl7" event={"ID":"5aad4222-11d4-4bb0-9a90-b9924339c70e","Type":"ContainerStarted","Data":"d8f2598affdcbf0f110c3d44b0d384645451aa262d59d2ac1199bc77cf91febf"}
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.101479 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-677cdf8c9f-j2vl7"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.112128 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.116174 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c6f587885-vvp7f"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.116469 4717 generic.go:334] "Generic (PLEG): container finished" podID="ea65478f-2244-4670-9821-526d00eb1b9a" containerID="3ac8b97244cc690abebbaa031bc67fa0f674a3eb556ca825dac2ea5e36a68c74" exitCode=137
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.116541 4717 generic.go:334] "Generic (PLEG): container finished" podID="ea65478f-2244-4670-9821-526d00eb1b9a" containerID="e504ae7a5cf31b49bcfa96a5edbc150c8e83603ed82f92f5b46325353da9c6d7" exitCode=137
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.116631 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c6f587885-vvp7f" event={"ID":"ea65478f-2244-4670-9821-526d00eb1b9a","Type":"ContainerDied","Data":"3ac8b97244cc690abebbaa031bc67fa0f674a3eb556ca825dac2ea5e36a68c74"}
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.116702 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c6f587885-vvp7f" event={"ID":"ea65478f-2244-4670-9821-526d00eb1b9a","Type":"ContainerDied","Data":"e504ae7a5cf31b49bcfa96a5edbc150c8e83603ed82f92f5b46325353da9c6d7"}
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.116772 4717 scope.go:117] "RemoveContainer" containerID="3ac8b97244cc690abebbaa031bc67fa0f674a3eb556ca825dac2ea5e36a68c74"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.122873 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-677cdf8c9f-j2vl7" podStartSLOduration=2.122845665 podStartE2EDuration="2.122845665s" podCreationTimestamp="2026-02-21 22:03:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:31.12136419 +0000 UTC m=+1025.902897812" watchObservedRunningTime="2026-02-21 22:03:31.122845665 +0000 UTC m=+1025.904379287"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.124558 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"db0133ac-ab76-4b9d-a3e9-e55a095f919a","Type":"ContainerStarted","Data":"451cfaf801cd58086785310b2337be36a4b896e831d44fc11b2e559e2c6810e6"}
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.125453 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.139331 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7ff8b8ffdf-4dwxk"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.186993 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.198918 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.215396 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdhmn\" (UniqueName: \"kubernetes.io/projected/439eab0e-0489-4a97-993e-c6c3df03e694-kube-api-access-vdhmn\") pod \"439eab0e-0489-4a97-993e-c6c3df03e694\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") "
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.215434 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/439eab0e-0489-4a97-993e-c6c3df03e694-config-data\") pod \"439eab0e-0489-4a97-993e-c6c3df03e694\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") "
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.215477 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/439eab0e-0489-4a97-993e-c6c3df03e694-logs\") pod \"439eab0e-0489-4a97-993e-c6c3df03e694\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") "
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.215511 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea65478f-2244-4670-9821-526d00eb1b9a-scripts\") pod \"ea65478f-2244-4670-9821-526d00eb1b9a\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") "
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.215573 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea65478f-2244-4670-9821-526d00eb1b9a-horizon-secret-key\") pod \"ea65478f-2244-4670-9821-526d00eb1b9a\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") "
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.215568 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.215546836 podStartE2EDuration="4.215546836s" podCreationTimestamp="2026-02-21 22:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:31.208125899 +0000 UTC m=+1025.989659521" watchObservedRunningTime="2026-02-21 22:03:31.215546836 +0000 UTC m=+1025.997080458"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.216581 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/439eab0e-0489-4a97-993e-c6c3df03e694-logs" (OuterVolumeSpecName: "logs") pod "439eab0e-0489-4a97-993e-c6c3df03e694" (UID: "439eab0e-0489-4a97-993e-c6c3df03e694"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.215598 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqm7c\" (UniqueName: \"kubernetes.io/projected/ea65478f-2244-4670-9821-526d00eb1b9a-kube-api-access-fqm7c\") pod \"ea65478f-2244-4670-9821-526d00eb1b9a\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") "
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.217057 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/439eab0e-0489-4a97-993e-c6c3df03e694-scripts\") pod \"439eab0e-0489-4a97-993e-c6c3df03e694\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") "
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.217095 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/439eab0e-0489-4a97-993e-c6c3df03e694-horizon-secret-key\") pod \"439eab0e-0489-4a97-993e-c6c3df03e694\" (UID: \"439eab0e-0489-4a97-993e-c6c3df03e694\") "
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.217149 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea65478f-2244-4670-9821-526d00eb1b9a-config-data\") pod \"ea65478f-2244-4670-9821-526d00eb1b9a\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") "
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.217184 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea65478f-2244-4670-9821-526d00eb1b9a-logs\") pod \"ea65478f-2244-4670-9821-526d00eb1b9a\" (UID: \"ea65478f-2244-4670-9821-526d00eb1b9a\") "
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.218153 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/439eab0e-0489-4a97-993e-c6c3df03e694-logs\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.227140 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea65478f-2244-4670-9821-526d00eb1b9a-kube-api-access-fqm7c" (OuterVolumeSpecName: "kube-api-access-fqm7c") pod "ea65478f-2244-4670-9821-526d00eb1b9a" (UID: "ea65478f-2244-4670-9821-526d00eb1b9a"). InnerVolumeSpecName "kube-api-access-fqm7c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.227886 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea65478f-2244-4670-9821-526d00eb1b9a-logs" (OuterVolumeSpecName: "logs") pod "ea65478f-2244-4670-9821-526d00eb1b9a" (UID: "ea65478f-2244-4670-9821-526d00eb1b9a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.229053 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/439eab0e-0489-4a97-993e-c6c3df03e694-kube-api-access-vdhmn" (OuterVolumeSpecName: "kube-api-access-vdhmn") pod "439eab0e-0489-4a97-993e-c6c3df03e694" (UID: "439eab0e-0489-4a97-993e-c6c3df03e694"). InnerVolumeSpecName "kube-api-access-vdhmn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.229525 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea65478f-2244-4670-9821-526d00eb1b9a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ea65478f-2244-4670-9821-526d00eb1b9a" (UID: "ea65478f-2244-4670-9821-526d00eb1b9a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.234000 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/439eab0e-0489-4a97-993e-c6c3df03e694-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "439eab0e-0489-4a97-993e-c6c3df03e694" (UID: "439eab0e-0489-4a97-993e-c6c3df03e694"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.269594 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea65478f-2244-4670-9821-526d00eb1b9a-scripts" (OuterVolumeSpecName: "scripts") pod "ea65478f-2244-4670-9821-526d00eb1b9a" (UID: "ea65478f-2244-4670-9821-526d00eb1b9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.270507 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439eab0e-0489-4a97-993e-c6c3df03e694-config-data" (OuterVolumeSpecName: "config-data") pod "439eab0e-0489-4a97-993e-c6c3df03e694" (UID: "439eab0e-0489-4a97-993e-c6c3df03e694"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.272732 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea65478f-2244-4670-9821-526d00eb1b9a-config-data" (OuterVolumeSpecName: "config-data") pod "ea65478f-2244-4670-9821-526d00eb1b9a" (UID: "ea65478f-2244-4670-9821-526d00eb1b9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.289448 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439eab0e-0489-4a97-993e-c6c3df03e694-scripts" (OuterVolumeSpecName: "scripts") pod "439eab0e-0489-4a97-993e-c6c3df03e694" (UID: "439eab0e-0489-4a97-993e-c6c3df03e694"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.320187 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea65478f-2244-4670-9821-526d00eb1b9a-scripts\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.320221 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea65478f-2244-4670-9821-526d00eb1b9a-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.320234 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqm7c\" (UniqueName: \"kubernetes.io/projected/ea65478f-2244-4670-9821-526d00eb1b9a-kube-api-access-fqm7c\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.320244 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/439eab0e-0489-4a97-993e-c6c3df03e694-scripts\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.320252 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/439eab0e-0489-4a97-993e-c6c3df03e694-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.320261 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea65478f-2244-4670-9821-526d00eb1b9a-config-data\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.320269 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea65478f-2244-4670-9821-526d00eb1b9a-logs\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.320277 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdhmn\" (UniqueName: \"kubernetes.io/projected/439eab0e-0489-4a97-993e-c6c3df03e694-kube-api-access-vdhmn\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.320285 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/439eab0e-0489-4a97-993e-c6c3df03e694-config-data\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.368086 4717 scope.go:117] "RemoveContainer" containerID="e504ae7a5cf31b49bcfa96a5edbc150c8e83603ed82f92f5b46325353da9c6d7"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.414532 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-669df94976-tmfpb"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.458486 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-86b8dffbf6-mztpd"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.468973 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b5d497dbf-bkvz6"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.473433 4717 scope.go:117] "RemoveContainer" containerID="3ac8b97244cc690abebbaa031bc67fa0f674a3eb556ca825dac2ea5e36a68c74"
Feb 21 22:03:31 crc kubenswrapper[4717]: E0221 22:03:31.473841 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ac8b97244cc690abebbaa031bc67fa0f674a3eb556ca825dac2ea5e36a68c74\": container with ID starting with 3ac8b97244cc690abebbaa031bc67fa0f674a3eb556ca825dac2ea5e36a68c74 not found: ID does not exist" containerID="3ac8b97244cc690abebbaa031bc67fa0f674a3eb556ca825dac2ea5e36a68c74"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.473892 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ac8b97244cc690abebbaa031bc67fa0f674a3eb556ca825dac2ea5e36a68c74"} err="failed to get container status \"3ac8b97244cc690abebbaa031bc67fa0f674a3eb556ca825dac2ea5e36a68c74\": rpc error: code = NotFound desc = could not find container \"3ac8b97244cc690abebbaa031bc67fa0f674a3eb556ca825dac2ea5e36a68c74\": container with ID starting with 3ac8b97244cc690abebbaa031bc67fa0f674a3eb556ca825dac2ea5e36a68c74 not found: ID does not exist"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.473917 4717 scope.go:117] "RemoveContainer" containerID="e504ae7a5cf31b49bcfa96a5edbc150c8e83603ed82f92f5b46325353da9c6d7"
Feb 21 22:03:31 crc kubenswrapper[4717]: E0221 22:03:31.474219 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e504ae7a5cf31b49bcfa96a5edbc150c8e83603ed82f92f5b46325353da9c6d7\": container with ID starting with e504ae7a5cf31b49bcfa96a5edbc150c8e83603ed82f92f5b46325353da9c6d7 not found: ID does not exist" containerID="e504ae7a5cf31b49bcfa96a5edbc150c8e83603ed82f92f5b46325353da9c6d7"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.474247 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e504ae7a5cf31b49bcfa96a5edbc150c8e83603ed82f92f5b46325353da9c6d7"} err="failed to get container status \"e504ae7a5cf31b49bcfa96a5edbc150c8e83603ed82f92f5b46325353da9c6d7\": rpc error: code = NotFound desc = could not find container \"e504ae7a5cf31b49bcfa96a5edbc150c8e83603ed82f92f5b46325353da9c6d7\": container with ID starting with e504ae7a5cf31b49bcfa96a5edbc150c8e83603ed82f92f5b46325353da9c6d7 not found: ID does not exist"
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.625450 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09f86cea-494d-4f11-b9f5-2045f7aabd92-config-data\") pod \"09f86cea-494d-4f11-b9f5-2045f7aabd92\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") "
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.625589 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09f86cea-494d-4f11-b9f5-2045f7aabd92-scripts\") pod \"09f86cea-494d-4f11-b9f5-2045f7aabd92\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") "
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.625635 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pchpc\" (UniqueName: \"kubernetes.io/projected/09f86cea-494d-4f11-b9f5-2045f7aabd92-kube-api-access-pchpc\") pod \"09f86cea-494d-4f11-b9f5-2045f7aabd92\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") "
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.625743 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/09f86cea-494d-4f11-b9f5-2045f7aabd92-horizon-secret-key\") pod \"09f86cea-494d-4f11-b9f5-2045f7aabd92\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") "
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.625798 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09f86cea-494d-4f11-b9f5-2045f7aabd92-logs\") pod \"09f86cea-494d-4f11-b9f5-2045f7aabd92\" (UID: \"09f86cea-494d-4f11-b9f5-2045f7aabd92\") "
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.626302 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09f86cea-494d-4f11-b9f5-2045f7aabd92-logs" (OuterVolumeSpecName: "logs") pod "09f86cea-494d-4f11-b9f5-2045f7aabd92" (UID: "09f86cea-494d-4f11-b9f5-2045f7aabd92"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.633758 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09f86cea-494d-4f11-b9f5-2045f7aabd92-kube-api-access-pchpc" (OuterVolumeSpecName: "kube-api-access-pchpc") pod "09f86cea-494d-4f11-b9f5-2045f7aabd92" (UID: "09f86cea-494d-4f11-b9f5-2045f7aabd92"). InnerVolumeSpecName "kube-api-access-pchpc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.645089 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09f86cea-494d-4f11-b9f5-2045f7aabd92-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "09f86cea-494d-4f11-b9f5-2045f7aabd92" (UID: "09f86cea-494d-4f11-b9f5-2045f7aabd92"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.648305 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f86cea-494d-4f11-b9f5-2045f7aabd92-scripts" (OuterVolumeSpecName: "scripts") pod "09f86cea-494d-4f11-b9f5-2045f7aabd92" (UID: "09f86cea-494d-4f11-b9f5-2045f7aabd92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.651828 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f86cea-494d-4f11-b9f5-2045f7aabd92-config-data" (OuterVolumeSpecName: "config-data") pod "09f86cea-494d-4f11-b9f5-2045f7aabd92" (UID: "09f86cea-494d-4f11-b9f5-2045f7aabd92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.727932 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09f86cea-494d-4f11-b9f5-2045f7aabd92-config-data\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.727962 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09f86cea-494d-4f11-b9f5-2045f7aabd92-scripts\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.727972 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pchpc\" (UniqueName: \"kubernetes.io/projected/09f86cea-494d-4f11-b9f5-2045f7aabd92-kube-api-access-pchpc\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.727982 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/09f86cea-494d-4f11-b9f5-2045f7aabd92-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:31 crc kubenswrapper[4717]: I0221 22:03:31.727990 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09f86cea-494d-4f11-b9f5-2045f7aabd92-logs\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.004650 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ff6e9c3-c94b-43ec-bba0-6e180be99f9e" path="/var/lib/kubelet/pods/7ff6e9c3-c94b-43ec-bba0-6e180be99f9e/volumes"
Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.111057 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-gsz4r"
Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.150310 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c6f587885-vvp7f" event={"ID":"ea65478f-2244-4670-9821-526d00eb1b9a","Type":"ContainerDied","Data":"85ab1d9e268f6662ef9f77d4dfe144b0aa9f409bbff245295e0f507ea52b2837"}
Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.150315 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c6f587885-vvp7f"
Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.155386 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ff8b8ffdf-4dwxk" event={"ID":"439eab0e-0489-4a97-993e-c6c3df03e694","Type":"ContainerDied","Data":"87885ae6b97823f2265aeba9f16a62d9488d237182d9d1b42f2afe2d43f98260"}
Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.155402 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7ff8b8ffdf-4dwxk"
Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.155433 4717 scope.go:117] "RemoveContainer" containerID="86c54231ad6bf38342e808fd0a76f35e200e9ceaff8aa1c46db4a352090a3b06"
Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.157920 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b5d497dbf-bkvz6"
Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.157937 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b5d497dbf-bkvz6" event={"ID":"09f86cea-494d-4f11-b9f5-2045f7aabd92","Type":"ContainerDied","Data":"5398f26e45c766dbab1cd4b70b92b07420e8879318c1c410c2e94aeffc461af3"}
Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.162675 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df","Type":"ContainerStarted","Data":"ba18616baa2d306c15338c2a8a7f166810b5e29106d040f5a51056a9756e12f5"}
Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.162731 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df","Type":"ContainerStarted","Data":"de0cb7e39543de7006a2ae836bb051cd38937afd2157c2960fc87f196e2d795c"}
Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.162776 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" containerName="cinder-scheduler" containerID="cri-o://1941e3ed1896ccd75f5724e696cdcba44836764c82a487e4a23ca3fb61898ab0" gracePeriod=30
Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.162894 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" containerName="probe" containerID="cri-o://8a034c2b39ce2ff13c6f05c5bb37b400cbed75ccf99d3f32374508ab4aaf231a" gracePeriod=30
Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.254235 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c6f587885-vvp7f"]
Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.273924 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5c6f587885-vvp7f"]
Feb 21 
22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.299946 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b5d497dbf-bkvz6"] Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.327908 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5b5d497dbf-bkvz6"] Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.350603 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7ff8b8ffdf-4dwxk"] Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.384178 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7ff8b8ffdf-4dwxk"] Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.400318 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-hqhkj"] Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.400555 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" podUID="92156922-8dc6-4d2f-9a66-92c8f049374c" containerName="dnsmasq-dns" containerID="cri-o://43f6e3127cebbc2bdc2e9dd347aba02fc31ecc713979b91ab03c50f957e619ed" gracePeriod=10 Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.402159 4717 scope.go:117] "RemoveContainer" containerID="8543a25dbf077d192eea9219229f62802d037cf29feaa2444a1507cb3cc6e9e7" Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.432620 4717 scope.go:117] "RemoveContainer" containerID="e150b7b66dcbe638973e20a725f834dd974991b8313f7237c6399a518093e9e7" Feb 21 22:03:32 crc kubenswrapper[4717]: I0221 22:03:32.678114 4717 scope.go:117] "RemoveContainer" containerID="208618fb7dba17474af74671a8910f21495f0c58490ed4c913725f2d1bc0de38" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.105891 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.171412 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-dns-swift-storage-0\") pod \"92156922-8dc6-4d2f-9a66-92c8f049374c\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.171497 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-ovsdbserver-sb\") pod \"92156922-8dc6-4d2f-9a66-92c8f049374c\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.171527 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-config\") pod \"92156922-8dc6-4d2f-9a66-92c8f049374c\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.171670 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-ovsdbserver-nb\") pod \"92156922-8dc6-4d2f-9a66-92c8f049374c\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.171750 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvsjf\" (UniqueName: \"kubernetes.io/projected/92156922-8dc6-4d2f-9a66-92c8f049374c-kube-api-access-dvsjf\") pod \"92156922-8dc6-4d2f-9a66-92c8f049374c\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.171804 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-dns-svc\") pod \"92156922-8dc6-4d2f-9a66-92c8f049374c\" (UID: \"92156922-8dc6-4d2f-9a66-92c8f049374c\") " Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.219896 4717 generic.go:334] "Generic (PLEG): container finished" podID="92156922-8dc6-4d2f-9a66-92c8f049374c" containerID="43f6e3127cebbc2bdc2e9dd347aba02fc31ecc713979b91ab03c50f957e619ed" exitCode=0 Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.219980 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" event={"ID":"92156922-8dc6-4d2f-9a66-92c8f049374c","Type":"ContainerDied","Data":"43f6e3127cebbc2bdc2e9dd347aba02fc31ecc713979b91ab03c50f957e619ed"} Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.220011 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" event={"ID":"92156922-8dc6-4d2f-9a66-92c8f049374c","Type":"ContainerDied","Data":"97d747abb4485011abdc6d38de9ff90cef2778ae2a730e8b0b8b853037b830ec"} Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.220037 4717 scope.go:117] "RemoveContainer" containerID="43f6e3127cebbc2bdc2e9dd347aba02fc31ecc713979b91ab03c50f957e619ed" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.220181 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-hqhkj" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.235791 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92156922-8dc6-4d2f-9a66-92c8f049374c-kube-api-access-dvsjf" (OuterVolumeSpecName: "kube-api-access-dvsjf") pod "92156922-8dc6-4d2f-9a66-92c8f049374c" (UID: "92156922-8dc6-4d2f-9a66-92c8f049374c"). InnerVolumeSpecName "kube-api-access-dvsjf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.257002 4717 scope.go:117] "RemoveContainer" containerID="32d7bc1dc16c91f904f341c512358a0536ee023dcfc922157a3e6cd93c18587f" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.269548 4717 generic.go:334] "Generic (PLEG): container finished" podID="90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" containerID="8a034c2b39ce2ff13c6f05c5bb37b400cbed75ccf99d3f32374508ab4aaf231a" exitCode=0 Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.269934 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b","Type":"ContainerDied","Data":"8a034c2b39ce2ff13c6f05c5bb37b400cbed75ccf99d3f32374508ab4aaf231a"} Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.274251 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvsjf\" (UniqueName: \"kubernetes.io/projected/92156922-8dc6-4d2f-9a66-92c8f049374c-kube-api-access-dvsjf\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.294721 4717 scope.go:117] "RemoveContainer" containerID="43f6e3127cebbc2bdc2e9dd347aba02fc31ecc713979b91ab03c50f957e619ed" Feb 21 22:03:33 crc kubenswrapper[4717]: E0221 22:03:33.296854 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43f6e3127cebbc2bdc2e9dd347aba02fc31ecc713979b91ab03c50f957e619ed\": container with ID starting with 43f6e3127cebbc2bdc2e9dd347aba02fc31ecc713979b91ab03c50f957e619ed not found: ID does not exist" containerID="43f6e3127cebbc2bdc2e9dd347aba02fc31ecc713979b91ab03c50f957e619ed" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.296920 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43f6e3127cebbc2bdc2e9dd347aba02fc31ecc713979b91ab03c50f957e619ed"} err="failed to get container status 
\"43f6e3127cebbc2bdc2e9dd347aba02fc31ecc713979b91ab03c50f957e619ed\": rpc error: code = NotFound desc = could not find container \"43f6e3127cebbc2bdc2e9dd347aba02fc31ecc713979b91ab03c50f957e619ed\": container with ID starting with 43f6e3127cebbc2bdc2e9dd347aba02fc31ecc713979b91ab03c50f957e619ed not found: ID does not exist" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.296945 4717 scope.go:117] "RemoveContainer" containerID="32d7bc1dc16c91f904f341c512358a0536ee023dcfc922157a3e6cd93c18587f" Feb 21 22:03:33 crc kubenswrapper[4717]: E0221 22:03:33.300209 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32d7bc1dc16c91f904f341c512358a0536ee023dcfc922157a3e6cd93c18587f\": container with ID starting with 32d7bc1dc16c91f904f341c512358a0536ee023dcfc922157a3e6cd93c18587f not found: ID does not exist" containerID="32d7bc1dc16c91f904f341c512358a0536ee023dcfc922157a3e6cd93c18587f" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.300258 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d7bc1dc16c91f904f341c512358a0536ee023dcfc922157a3e6cd93c18587f"} err="failed to get container status \"32d7bc1dc16c91f904f341c512358a0536ee023dcfc922157a3e6cd93c18587f\": rpc error: code = NotFound desc = could not find container \"32d7bc1dc16c91f904f341c512358a0536ee023dcfc922157a3e6cd93c18587f\": container with ID starting with 32d7bc1dc16c91f904f341c512358a0536ee023dcfc922157a3e6cd93c18587f not found: ID does not exist" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.428146 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-config" (OuterVolumeSpecName: "config") pod "92156922-8dc6-4d2f-9a66-92c8f049374c" (UID: "92156922-8dc6-4d2f-9a66-92c8f049374c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.441748 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "92156922-8dc6-4d2f-9a66-92c8f049374c" (UID: "92156922-8dc6-4d2f-9a66-92c8f049374c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.452431 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "92156922-8dc6-4d2f-9a66-92c8f049374c" (UID: "92156922-8dc6-4d2f-9a66-92c8f049374c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.455645 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "92156922-8dc6-4d2f-9a66-92c8f049374c" (UID: "92156922-8dc6-4d2f-9a66-92c8f049374c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.467327 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "92156922-8dc6-4d2f-9a66-92c8f049374c" (UID: "92156922-8dc6-4d2f-9a66-92c8f049374c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.477942 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.477969 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.477979 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.477987 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.477995 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92156922-8dc6-4d2f-9a66-92c8f049374c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.568261 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-hqhkj"] Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.574998 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-hqhkj"] Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.591027 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-86b8dffbf6-mztpd" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.638699 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-669df94976-tmfpb"] Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.638965 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-669df94976-tmfpb" podUID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerName="horizon-log" containerID="cri-o://3e81fcad833c7e7a7f433096a45c795ab83d621d5970448361c95c6961841ed0" gracePeriod=30 Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.639089 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-669df94976-tmfpb" podUID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerName="horizon" containerID="cri-o://81ea984b703415ccda6d6772c5e00a9f9a88a1f920ae9fd881bd21238a2548b2" gracePeriod=30 Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.660025 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-669df94976-tmfpb" podUID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.834966 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.988566 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09f86cea-494d-4f11-b9f5-2045f7aabd92" path="/var/lib/kubelet/pods/09f86cea-494d-4f11-b9f5-2045f7aabd92/volumes" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.989809 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="439eab0e-0489-4a97-993e-c6c3df03e694" path="/var/lib/kubelet/pods/439eab0e-0489-4a97-993e-c6c3df03e694/volumes" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.990417 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92156922-8dc6-4d2f-9a66-92c8f049374c" 
path="/var/lib/kubelet/pods/92156922-8dc6-4d2f-9a66-92c8f049374c/volumes" Feb 21 22:03:33 crc kubenswrapper[4717]: I0221 22:03:33.991518 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea65478f-2244-4670-9821-526d00eb1b9a" path="/var/lib/kubelet/pods/ea65478f-2244-4670-9821-526d00eb1b9a/volumes" Feb 21 22:03:34 crc kubenswrapper[4717]: I0221 22:03:34.285489 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df","Type":"ContainerStarted","Data":"91f2afbd1f8d214e6c8cdb70cbbde386147649062e753feb93f24ca57347cfae"} Feb 21 22:03:34 crc kubenswrapper[4717]: I0221 22:03:34.295065 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-578d8fcbf6-jg7l4" Feb 21 22:03:35 crc kubenswrapper[4717]: I0221 22:03:35.294271 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df","Type":"ContainerStarted","Data":"984d0d934dc14e4af01410f368a4e033baff48094ef3b6a56c4571b86abc662e"} Feb 21 22:03:35 crc kubenswrapper[4717]: I0221 22:03:35.391052 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7cc86c585-7kk6w" podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": dial tcp 10.217.0.156:9696: connect: connection refused" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.333287 4717 generic.go:334] "Generic (PLEG): container finished" podID="90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" containerID="1941e3ed1896ccd75f5724e696cdcba44836764c82a487e4a23ca3fb61898ab0" exitCode=0 Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.333631 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b","Type":"ContainerDied","Data":"1941e3ed1896ccd75f5724e696cdcba44836764c82a487e4a23ca3fb61898ab0"} Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.616029 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.681209 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.716784 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.731513 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48ccp\" (UniqueName: \"kubernetes.io/projected/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-kube-api-access-48ccp\") pod \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.731845 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-etc-machine-id\") pod \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.731974 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-combined-ca-bundle\") pod \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.732055 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-scripts\") pod \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.732175 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-config-data\") pod \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.732298 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-config-data-custom\") pod \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\" (UID: \"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b\") " Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.734310 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" (UID: "90a47ccd-8d81-441e-aa32-a1c8d6d7af8b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.744293 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" (UID: "90a47ccd-8d81-441e-aa32-a1c8d6d7af8b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.744651 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-kube-api-access-48ccp" (OuterVolumeSpecName: "kube-api-access-48ccp") pod "90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" (UID: "90a47ccd-8d81-441e-aa32-a1c8d6d7af8b"). InnerVolumeSpecName "kube-api-access-48ccp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.754960 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-scripts" (OuterVolumeSpecName: "scripts") pod "90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" (UID: "90a47ccd-8d81-441e-aa32-a1c8d6d7af8b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.812912 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-669df94976-tmfpb" podUID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:34322->10.217.0.148:8443: read: connection reset by peer" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.838060 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48ccp\" (UniqueName: \"kubernetes.io/projected/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-kube-api-access-48ccp\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.838089 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.838099 4717 reconciler_common.go:293] "Volume 
detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.838108 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.898970 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" (UID: "90a47ccd-8d81-441e-aa32-a1c8d6d7af8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.934068 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-config-data" (OuterVolumeSpecName: "config-data") pod "90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" (UID: "90a47ccd-8d81-441e-aa32-a1c8d6d7af8b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.943190 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:36 crc kubenswrapper[4717]: I0221 22:03:36.943281 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.014119 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7f5bdf5f76-9rjst"] Feb 21 22:03:37 crc kubenswrapper[4717]: E0221 22:03:37.016138 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f86cea-494d-4f11-b9f5-2045f7aabd92" containerName="horizon" Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016161 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f86cea-494d-4f11-b9f5-2045f7aabd92" containerName="horizon" Feb 21 22:03:37 crc kubenswrapper[4717]: E0221 22:03:37.016175 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea65478f-2244-4670-9821-526d00eb1b9a" containerName="horizon" Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016193 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea65478f-2244-4670-9821-526d00eb1b9a" containerName="horizon" Feb 21 22:03:37 crc kubenswrapper[4717]: E0221 22:03:37.016212 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92156922-8dc6-4d2f-9a66-92c8f049374c" containerName="init" Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016221 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="92156922-8dc6-4d2f-9a66-92c8f049374c" containerName="init" Feb 21 22:03:37 crc kubenswrapper[4717]: E0221 22:03:37.016235 4717 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="439eab0e-0489-4a97-993e-c6c3df03e694" containerName="horizon" Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016243 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="439eab0e-0489-4a97-993e-c6c3df03e694" containerName="horizon" Feb 21 22:03:37 crc kubenswrapper[4717]: E0221 22:03:37.016253 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea65478f-2244-4670-9821-526d00eb1b9a" containerName="horizon-log" Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016259 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea65478f-2244-4670-9821-526d00eb1b9a" containerName="horizon-log" Feb 21 22:03:37 crc kubenswrapper[4717]: E0221 22:03:37.016275 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" containerName="probe" Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016281 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" containerName="probe" Feb 21 22:03:37 crc kubenswrapper[4717]: E0221 22:03:37.016300 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f86cea-494d-4f11-b9f5-2045f7aabd92" containerName="horizon-log" Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016307 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f86cea-494d-4f11-b9f5-2045f7aabd92" containerName="horizon-log" Feb 21 22:03:37 crc kubenswrapper[4717]: E0221 22:03:37.016318 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439eab0e-0489-4a97-993e-c6c3df03e694" containerName="horizon-log" Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016324 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="439eab0e-0489-4a97-993e-c6c3df03e694" containerName="horizon-log" Feb 21 22:03:37 crc kubenswrapper[4717]: E0221 22:03:37.016333 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" 
containerName="cinder-scheduler"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016338 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" containerName="cinder-scheduler"
Feb 21 22:03:37 crc kubenswrapper[4717]: E0221 22:03:37.016348 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92156922-8dc6-4d2f-9a66-92c8f049374c" containerName="dnsmasq-dns"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016354 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="92156922-8dc6-4d2f-9a66-92c8f049374c" containerName="dnsmasq-dns"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016506 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" containerName="probe"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016519 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="09f86cea-494d-4f11-b9f5-2045f7aabd92" containerName="horizon-log"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016531 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="439eab0e-0489-4a97-993e-c6c3df03e694" containerName="horizon"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016544 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="92156922-8dc6-4d2f-9a66-92c8f049374c" containerName="dnsmasq-dns"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016555 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" containerName="cinder-scheduler"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016562 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea65478f-2244-4670-9821-526d00eb1b9a" containerName="horizon"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016573 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea65478f-2244-4670-9821-526d00eb1b9a" containerName="horizon-log"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016582 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="439eab0e-0489-4a97-993e-c6c3df03e694" containerName="horizon-log"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.016591 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="09f86cea-494d-4f11-b9f5-2045f7aabd92" containerName="horizon"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.017509 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.027063 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f5bdf5f76-9rjst"]
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.146431 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131cec55-efe0-49f9-ad5e-cfbca687c941-config-data\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.146498 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm97c\" (UniqueName: \"kubernetes.io/projected/131cec55-efe0-49f9-ad5e-cfbca687c941-kube-api-access-fm97c\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.146563 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131cec55-efe0-49f9-ad5e-cfbca687c941-combined-ca-bundle\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.146597 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/131cec55-efe0-49f9-ad5e-cfbca687c941-scripts\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.146619 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/131cec55-efe0-49f9-ad5e-cfbca687c941-logs\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.146641 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/131cec55-efe0-49f9-ad5e-cfbca687c941-internal-tls-certs\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.146722 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/131cec55-efe0-49f9-ad5e-cfbca687c941-public-tls-certs\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.248468 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm97c\" (UniqueName: \"kubernetes.io/projected/131cec55-efe0-49f9-ad5e-cfbca687c941-kube-api-access-fm97c\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.248804 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131cec55-efe0-49f9-ad5e-cfbca687c941-combined-ca-bundle\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.248937 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/131cec55-efe0-49f9-ad5e-cfbca687c941-scripts\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.249040 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/131cec55-efe0-49f9-ad5e-cfbca687c941-logs\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.249173 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/131cec55-efe0-49f9-ad5e-cfbca687c941-internal-tls-certs\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.249346 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/131cec55-efe0-49f9-ad5e-cfbca687c941-public-tls-certs\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.249441 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131cec55-efe0-49f9-ad5e-cfbca687c941-config-data\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.249889 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/131cec55-efe0-49f9-ad5e-cfbca687c941-logs\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.254399 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/131cec55-efe0-49f9-ad5e-cfbca687c941-scripts\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.254517 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/131cec55-efe0-49f9-ad5e-cfbca687c941-internal-tls-certs\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.255615 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131cec55-efe0-49f9-ad5e-cfbca687c941-config-data\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.256478 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/131cec55-efe0-49f9-ad5e-cfbca687c941-public-tls-certs\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.256587 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131cec55-efe0-49f9-ad5e-cfbca687c941-combined-ca-bundle\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.266640 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm97c\" (UniqueName: \"kubernetes.io/projected/131cec55-efe0-49f9-ad5e-cfbca687c941-kube-api-access-fm97c\") pod \"placement-7f5bdf5f76-9rjst\" (UID: \"131cec55-efe0-49f9-ad5e-cfbca687c941\") " pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.332751 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.343032 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.344056 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90a47ccd-8d81-441e-aa32-a1c8d6d7af8b","Type":"ContainerDied","Data":"3e55860e3ba9b287f1b49e50cc81cc20d9e77ca9c7aa32875c34058a210132ce"}
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.344180 4717 scope.go:117] "RemoveContainer" containerID="8a034c2b39ce2ff13c6f05c5bb37b400cbed75ccf99d3f32374508ab4aaf231a"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.344951 4717 generic.go:334] "Generic (PLEG): container finished" podID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerID="81ea984b703415ccda6d6772c5e00a9f9a88a1f920ae9fd881bd21238a2548b2" exitCode=0
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.345008 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-669df94976-tmfpb" event={"ID":"7af1cf64-7044-4170-9ba4-bcc17d97cbb2","Type":"ContainerDied","Data":"81ea984b703415ccda6d6772c5e00a9f9a88a1f920ae9fd881bd21238a2548b2"}
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.352916 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df","Type":"ContainerStarted","Data":"a1eeefa03a4ffad553ca02b4401d4f4714edf9bb2001de411c5257f93d89eeb1"}
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.394539 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.213233736 podStartE2EDuration="7.394523069s" podCreationTimestamp="2026-02-21 22:03:30 +0000 UTC" firstStartedPulling="2026-02-21 22:03:31.381293109 +0000 UTC m=+1026.162826731" lastFinishedPulling="2026-02-21 22:03:36.562582442 +0000 UTC m=+1031.344116064" observedRunningTime="2026-02-21 22:03:37.380374739 +0000 UTC m=+1032.161908361" watchObservedRunningTime="2026-02-21 22:03:37.394523069 +0000 UTC m=+1032.176056691"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.460584 4717 scope.go:117] "RemoveContainer" containerID="1941e3ed1896ccd75f5724e696cdcba44836764c82a487e4a23ca3fb61898ab0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.467441 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.482994 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.497911 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.499420 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.502873 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.523387 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.554953 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9955b361-63c8-42bb-9efc-7ab0b3150904-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.555015 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn7ww\" (UniqueName: \"kubernetes.io/projected/9955b361-63c8-42bb-9efc-7ab0b3150904-kube-api-access-gn7ww\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.555048 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9955b361-63c8-42bb-9efc-7ab0b3150904-scripts\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.555116 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9955b361-63c8-42bb-9efc-7ab0b3150904-config-data\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.555171 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9955b361-63c8-42bb-9efc-7ab0b3150904-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.555195 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9955b361-63c8-42bb-9efc-7ab0b3150904-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.656218 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9955b361-63c8-42bb-9efc-7ab0b3150904-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.656491 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9955b361-63c8-42bb-9efc-7ab0b3150904-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.656561 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9955b361-63c8-42bb-9efc-7ab0b3150904-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.656592 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn7ww\" (UniqueName: \"kubernetes.io/projected/9955b361-63c8-42bb-9efc-7ab0b3150904-kube-api-access-gn7ww\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.656626 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9955b361-63c8-42bb-9efc-7ab0b3150904-scripts\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.656687 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9955b361-63c8-42bb-9efc-7ab0b3150904-config-data\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.659576 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9955b361-63c8-42bb-9efc-7ab0b3150904-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.662733 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9955b361-63c8-42bb-9efc-7ab0b3150904-config-data\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.664393 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9955b361-63c8-42bb-9efc-7ab0b3150904-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.664428 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9955b361-63c8-42bb-9efc-7ab0b3150904-scripts\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.664843 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9955b361-63c8-42bb-9efc-7ab0b3150904-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.678262 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn7ww\" (UniqueName: \"kubernetes.io/projected/9955b361-63c8-42bb-9efc-7ab0b3150904-kube-api-access-gn7ww\") pod \"cinder-scheduler-0\" (UID: \"9955b361-63c8-42bb-9efc-7ab0b3150904\") " pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.847289 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 21 22:03:37 crc kubenswrapper[4717]: I0221 22:03:37.929636 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f5bdf5f76-9rjst"]
Feb 21 22:03:38 crc kubenswrapper[4717]: I0221 22:03:38.034256 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90a47ccd-8d81-441e-aa32-a1c8d6d7af8b" path="/var/lib/kubelet/pods/90a47ccd-8d81-441e-aa32-a1c8d6d7af8b/volumes"
Feb 21 22:03:38 crc kubenswrapper[4717]: W0221 22:03:38.356221 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9955b361_63c8_42bb_9efc_7ab0b3150904.slice/crio-3b0dd68b7964d55eee6c81a216917e04effdcdd14b1063645398c892e646b611 WatchSource:0}: Error finding container 3b0dd68b7964d55eee6c81a216917e04effdcdd14b1063645398c892e646b611: Status 404 returned error can't find the container with id 3b0dd68b7964d55eee6c81a216917e04effdcdd14b1063645398c892e646b611
Feb 21 22:03:38 crc kubenswrapper[4717]: I0221 22:03:38.356544 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 21 22:03:38 crc kubenswrapper[4717]: I0221 22:03:38.365306 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f5bdf5f76-9rjst" event={"ID":"131cec55-efe0-49f9-ad5e-cfbca687c941","Type":"ContainerStarted","Data":"771e43167ce0122513b26b5034e2333eecc5f366fc52337a157b72385253949a"}
Feb 21 22:03:38 crc kubenswrapper[4717]: I0221 22:03:38.365357 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f5bdf5f76-9rjst" event={"ID":"131cec55-efe0-49f9-ad5e-cfbca687c941","Type":"ContainerStarted","Data":"eb77234f9b02e5a0cbf30c32aabe85f32644645ce62f0153ed2661aeca3f9fe1"}
Feb 21 22:03:38 crc kubenswrapper[4717]: I0221 22:03:38.365414 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 21 22:03:38 crc kubenswrapper[4717]: I0221 22:03:38.846281 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-669df94976-tmfpb" podUID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused"
Feb 21 22:03:39 crc kubenswrapper[4717]: I0221 22:03:39.183404 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-df9cd9b6-x9br7"
Feb 21 22:03:39 crc kubenswrapper[4717]: I0221 22:03:39.302666 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-df9cd9b6-x9br7"
Feb 21 22:03:39 crc kubenswrapper[4717]: I0221 22:03:39.377781 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-578d8fcbf6-jg7l4"]
Feb 21 22:03:39 crc kubenswrapper[4717]: I0221 22:03:39.378427 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-578d8fcbf6-jg7l4" podUID="33f84b4a-a654-4453-b1cf-c18ae33fd406" containerName="barbican-api-log" containerID="cri-o://3dc2fd4e57614c4ce0d3977a3abc762fa5a3541a7d6ee321dc08ad77aed8e631" gracePeriod=30
Feb 21 22:03:39 crc kubenswrapper[4717]: I0221 22:03:39.378580 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-578d8fcbf6-jg7l4" podUID="33f84b4a-a654-4453-b1cf-c18ae33fd406" containerName="barbican-api" containerID="cri-o://ae73ebaaa6364e4b0a644a8f646615e548e39ddfa298cf644a3ae033aec93f63" gracePeriod=30
Feb 21 22:03:39 crc kubenswrapper[4717]: I0221 22:03:39.392182 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9955b361-63c8-42bb-9efc-7ab0b3150904","Type":"ContainerStarted","Data":"30d42cd5f4768b69350f33e504612499f9b83995469df50ee12acb0f70ec29cd"}
Feb 21 22:03:39 crc kubenswrapper[4717]: I0221 22:03:39.392223 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9955b361-63c8-42bb-9efc-7ab0b3150904","Type":"ContainerStarted","Data":"3b0dd68b7964d55eee6c81a216917e04effdcdd14b1063645398c892e646b611"}
Feb 21 22:03:39 crc kubenswrapper[4717]: I0221 22:03:39.399308 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f5bdf5f76-9rjst" event={"ID":"131cec55-efe0-49f9-ad5e-cfbca687c941","Type":"ContainerStarted","Data":"c4d5ca24a396c3e12d9414c26f2e5a7013fed7dea98a4ed0cf099898c9c361f3"}
Feb 21 22:03:39 crc kubenswrapper[4717]: I0221 22:03:39.399602 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:39 crc kubenswrapper[4717]: I0221 22:03:39.400162 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:03:39 crc kubenswrapper[4717]: I0221 22:03:39.431005 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7f5bdf5f76-9rjst" podStartSLOduration=3.43098978 podStartE2EDuration="3.43098978s" podCreationTimestamp="2026-02-21 22:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:39.422741573 +0000 UTC m=+1034.204275195" watchObservedRunningTime="2026-02-21 22:03:39.43098978 +0000 UTC m=+1034.212523402"
Feb 21 22:03:40 crc kubenswrapper[4717]: I0221 22:03:40.005893 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 21 22:03:40 crc kubenswrapper[4717]: I0221 22:03:40.405592 4717 generic.go:334] "Generic (PLEG): container finished" podID="33f84b4a-a654-4453-b1cf-c18ae33fd406" containerID="3dc2fd4e57614c4ce0d3977a3abc762fa5a3541a7d6ee321dc08ad77aed8e631" exitCode=143
Feb 21 22:03:40 crc kubenswrapper[4717]: I0221 22:03:40.405664 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578d8fcbf6-jg7l4" event={"ID":"33f84b4a-a654-4453-b1cf-c18ae33fd406","Type":"ContainerDied","Data":"3dc2fd4e57614c4ce0d3977a3abc762fa5a3541a7d6ee321dc08ad77aed8e631"}
Feb 21 22:03:40 crc kubenswrapper[4717]: I0221 22:03:40.408788 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9955b361-63c8-42bb-9efc-7ab0b3150904","Type":"ContainerStarted","Data":"37c4914947167c3d3fe0df69a498d295d2f08d8d6d93ce3f4520c1e950439acf"}
Feb 21 22:03:40 crc kubenswrapper[4717]: I0221 22:03:40.435046 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.435025011 podStartE2EDuration="3.435025011s" podCreationTimestamp="2026-02-21 22:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:40.425611155 +0000 UTC m=+1035.207144777" watchObservedRunningTime="2026-02-21 22:03:40.435025011 +0000 UTC m=+1035.216558633"
Feb 21 22:03:42 crc kubenswrapper[4717]: I0221 22:03:42.552535 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-578d8fcbf6-jg7l4" podUID="33f84b4a-a654-4453-b1cf-c18ae33fd406" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:42736->10.217.0.167:9311: read: connection reset by peer"
Feb 21 22:03:42 crc kubenswrapper[4717]: I0221 22:03:42.552572 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-578d8fcbf6-jg7l4" podUID="33f84b4a-a654-4453-b1cf-c18ae33fd406" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:42730->10.217.0.167:9311: read: connection reset by peer"
Feb 21 22:03:42 crc kubenswrapper[4717]: I0221 22:03:42.847496 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 21 22:03:42 crc kubenswrapper[4717]: I0221 22:03:42.953906 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-578d8fcbf6-jg7l4"
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.097155 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmqpr\" (UniqueName: \"kubernetes.io/projected/33f84b4a-a654-4453-b1cf-c18ae33fd406-kube-api-access-vmqpr\") pod \"33f84b4a-a654-4453-b1cf-c18ae33fd406\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") "
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.097516 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33f84b4a-a654-4453-b1cf-c18ae33fd406-logs\") pod \"33f84b4a-a654-4453-b1cf-c18ae33fd406\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") "
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.097580 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-combined-ca-bundle\") pod \"33f84b4a-a654-4453-b1cf-c18ae33fd406\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") "
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.097691 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-config-data-custom\") pod \"33f84b4a-a654-4453-b1cf-c18ae33fd406\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") "
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.097750 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-config-data\") pod \"33f84b4a-a654-4453-b1cf-c18ae33fd406\" (UID: \"33f84b4a-a654-4453-b1cf-c18ae33fd406\") "
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.100480 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33f84b4a-a654-4453-b1cf-c18ae33fd406-logs" (OuterVolumeSpecName: "logs") pod "33f84b4a-a654-4453-b1cf-c18ae33fd406" (UID: "33f84b4a-a654-4453-b1cf-c18ae33fd406"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.115169 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "33f84b4a-a654-4453-b1cf-c18ae33fd406" (UID: "33f84b4a-a654-4453-b1cf-c18ae33fd406"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.119214 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f84b4a-a654-4453-b1cf-c18ae33fd406-kube-api-access-vmqpr" (OuterVolumeSpecName: "kube-api-access-vmqpr") pod "33f84b4a-a654-4453-b1cf-c18ae33fd406" (UID: "33f84b4a-a654-4453-b1cf-c18ae33fd406"). InnerVolumeSpecName "kube-api-access-vmqpr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.145591 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33f84b4a-a654-4453-b1cf-c18ae33fd406" (UID: "33f84b4a-a654-4453-b1cf-c18ae33fd406"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.171520 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-config-data" (OuterVolumeSpecName: "config-data") pod "33f84b4a-a654-4453-b1cf-c18ae33fd406" (UID: "33f84b4a-a654-4453-b1cf-c18ae33fd406"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.200646 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmqpr\" (UniqueName: \"kubernetes.io/projected/33f84b4a-a654-4453-b1cf-c18ae33fd406-kube-api-access-vmqpr\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.200697 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33f84b4a-a654-4453-b1cf-c18ae33fd406-logs\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.200751 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.200765 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.200777 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f84b4a-a654-4453-b1cf-c18ae33fd406-config-data\") on node \"crc\" DevicePath \"\""
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.438982 4717 generic.go:334] "Generic (PLEG): container finished" podID="33f84b4a-a654-4453-b1cf-c18ae33fd406" containerID="ae73ebaaa6364e4b0a644a8f646615e548e39ddfa298cf644a3ae033aec93f63" exitCode=0
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.439057 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-578d8fcbf6-jg7l4"
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.439039 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578d8fcbf6-jg7l4" event={"ID":"33f84b4a-a654-4453-b1cf-c18ae33fd406","Type":"ContainerDied","Data":"ae73ebaaa6364e4b0a644a8f646615e548e39ddfa298cf644a3ae033aec93f63"}
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.440517 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-578d8fcbf6-jg7l4" event={"ID":"33f84b4a-a654-4453-b1cf-c18ae33fd406","Type":"ContainerDied","Data":"3ce4190a09bac64fd14bbeeb70c39e8a4fb40dfaae0db1d8e649c3b638f962aa"}
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.440587 4717 scope.go:117] "RemoveContainer" containerID="ae73ebaaa6364e4b0a644a8f646615e548e39ddfa298cf644a3ae033aec93f63"
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.465923 4717 scope.go:117] "RemoveContainer" containerID="3dc2fd4e57614c4ce0d3977a3abc762fa5a3541a7d6ee321dc08ad77aed8e631"
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.487703 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-578d8fcbf6-jg7l4"]
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.498034 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-578d8fcbf6-jg7l4"]
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.513431 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-86d46bb596-pj8cr"
Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.514627 4717 scope.go:117] "RemoveContainer" containerID="ae73ebaaa6364e4b0a644a8f646615e548e39ddfa298cf644a3ae033aec93f63"
Feb 21 22:03:43 crc kubenswrapper[4717]: E0221
22:03:43.515317 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae73ebaaa6364e4b0a644a8f646615e548e39ddfa298cf644a3ae033aec93f63\": container with ID starting with ae73ebaaa6364e4b0a644a8f646615e548e39ddfa298cf644a3ae033aec93f63 not found: ID does not exist" containerID="ae73ebaaa6364e4b0a644a8f646615e548e39ddfa298cf644a3ae033aec93f63" Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.515394 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae73ebaaa6364e4b0a644a8f646615e548e39ddfa298cf644a3ae033aec93f63"} err="failed to get container status \"ae73ebaaa6364e4b0a644a8f646615e548e39ddfa298cf644a3ae033aec93f63\": rpc error: code = NotFound desc = could not find container \"ae73ebaaa6364e4b0a644a8f646615e548e39ddfa298cf644a3ae033aec93f63\": container with ID starting with ae73ebaaa6364e4b0a644a8f646615e548e39ddfa298cf644a3ae033aec93f63 not found: ID does not exist" Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.515449 4717 scope.go:117] "RemoveContainer" containerID="3dc2fd4e57614c4ce0d3977a3abc762fa5a3541a7d6ee321dc08ad77aed8e631" Feb 21 22:03:43 crc kubenswrapper[4717]: E0221 22:03:43.516341 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dc2fd4e57614c4ce0d3977a3abc762fa5a3541a7d6ee321dc08ad77aed8e631\": container with ID starting with 3dc2fd4e57614c4ce0d3977a3abc762fa5a3541a7d6ee321dc08ad77aed8e631 not found: ID does not exist" containerID="3dc2fd4e57614c4ce0d3977a3abc762fa5a3541a7d6ee321dc08ad77aed8e631" Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.516444 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dc2fd4e57614c4ce0d3977a3abc762fa5a3541a7d6ee321dc08ad77aed8e631"} err="failed to get container status \"3dc2fd4e57614c4ce0d3977a3abc762fa5a3541a7d6ee321dc08ad77aed8e631\": rpc 
error: code = NotFound desc = could not find container \"3dc2fd4e57614c4ce0d3977a3abc762fa5a3541a7d6ee321dc08ad77aed8e631\": container with ID starting with 3dc2fd4e57614c4ce0d3977a3abc762fa5a3541a7d6ee321dc08ad77aed8e631 not found: ID does not exist" Feb 21 22:03:43 crc kubenswrapper[4717]: I0221 22:03:43.995822 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f84b4a-a654-4453-b1cf-c18ae33fd406" path="/var/lib/kubelet/pods/33f84b4a-a654-4453-b1cf-c18ae33fd406/volumes" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.655465 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 21 22:03:47 crc kubenswrapper[4717]: E0221 22:03:47.656355 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f84b4a-a654-4453-b1cf-c18ae33fd406" containerName="barbican-api-log" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.656379 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f84b4a-a654-4453-b1cf-c18ae33fd406" containerName="barbican-api-log" Feb 21 22:03:47 crc kubenswrapper[4717]: E0221 22:03:47.656397 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f84b4a-a654-4453-b1cf-c18ae33fd406" containerName="barbican-api" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.656403 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f84b4a-a654-4453-b1cf-c18ae33fd406" containerName="barbican-api" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.656586 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f84b4a-a654-4453-b1cf-c18ae33fd406" containerName="barbican-api" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.656616 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f84b4a-a654-4453-b1cf-c18ae33fd406" containerName="barbican-api-log" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.657170 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.661410 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-rwdck" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.661828 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.665940 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.676797 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.792493 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/756781b7-938f-4654-8109-725420287d7b-openstack-config\") pod \"openstackclient\" (UID: \"756781b7-938f-4654-8109-725420287d7b\") " pod="openstack/openstackclient" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.792886 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb9kg\" (UniqueName: \"kubernetes.io/projected/756781b7-938f-4654-8109-725420287d7b-kube-api-access-bb9kg\") pod \"openstackclient\" (UID: \"756781b7-938f-4654-8109-725420287d7b\") " pod="openstack/openstackclient" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.792951 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756781b7-938f-4654-8109-725420287d7b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"756781b7-938f-4654-8109-725420287d7b\") " pod="openstack/openstackclient" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.792983 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/756781b7-938f-4654-8109-725420287d7b-openstack-config-secret\") pod \"openstackclient\" (UID: \"756781b7-938f-4654-8109-725420287d7b\") " pod="openstack/openstackclient" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.895466 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/756781b7-938f-4654-8109-725420287d7b-openstack-config\") pod \"openstackclient\" (UID: \"756781b7-938f-4654-8109-725420287d7b\") " pod="openstack/openstackclient" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.895638 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb9kg\" (UniqueName: \"kubernetes.io/projected/756781b7-938f-4654-8109-725420287d7b-kube-api-access-bb9kg\") pod \"openstackclient\" (UID: \"756781b7-938f-4654-8109-725420287d7b\") " pod="openstack/openstackclient" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.895750 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756781b7-938f-4654-8109-725420287d7b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"756781b7-938f-4654-8109-725420287d7b\") " pod="openstack/openstackclient" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.895789 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/756781b7-938f-4654-8109-725420287d7b-openstack-config-secret\") pod \"openstackclient\" (UID: \"756781b7-938f-4654-8109-725420287d7b\") " pod="openstack/openstackclient" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.896822 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/756781b7-938f-4654-8109-725420287d7b-openstack-config\") pod \"openstackclient\" (UID: \"756781b7-938f-4654-8109-725420287d7b\") " pod="openstack/openstackclient" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.903954 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/756781b7-938f-4654-8109-725420287d7b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"756781b7-938f-4654-8109-725420287d7b\") " pod="openstack/openstackclient" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.907549 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/756781b7-938f-4654-8109-725420287d7b-openstack-config-secret\") pod \"openstackclient\" (UID: \"756781b7-938f-4654-8109-725420287d7b\") " pod="openstack/openstackclient" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.925986 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb9kg\" (UniqueName: \"kubernetes.io/projected/756781b7-938f-4654-8109-725420287d7b-kube-api-access-bb9kg\") pod \"openstackclient\" (UID: \"756781b7-938f-4654-8109-725420287d7b\") " pod="openstack/openstackclient" Feb 21 22:03:47 crc kubenswrapper[4717]: I0221 22:03:47.973962 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 21 22:03:48 crc kubenswrapper[4717]: I0221 22:03:48.102804 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 21 22:03:48 crc kubenswrapper[4717]: I0221 22:03:48.446607 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 21 22:03:48 crc kubenswrapper[4717]: I0221 22:03:48.504958 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"756781b7-938f-4654-8109-725420287d7b","Type":"ContainerStarted","Data":"86b869964d86b279c57e2b291fb82a31693875be04f7145c4d00103fd7702079"} Feb 21 22:03:48 crc kubenswrapper[4717]: I0221 22:03:48.846322 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-669df94976-tmfpb" podUID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.165848 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5c4dc8df6c-b88lw"] Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.167793 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5c4dc8df6c-b88lw" Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.170601 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.170838 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.171243 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.185154 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c4dc8df6c-b88lw"] Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.268597 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2vzf\" (UniqueName: \"kubernetes.io/projected/b1949ec1-5153-4003-b960-68a8f126b72d-kube-api-access-j2vzf\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw" Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.268656 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1949ec1-5153-4003-b960-68a8f126b72d-run-httpd\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw" Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.268692 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1949ec1-5153-4003-b960-68a8f126b72d-internal-tls-certs\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw" Feb 21 22:03:51 crc 
kubenswrapper[4717]: I0221 22:03:51.268723 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1949ec1-5153-4003-b960-68a8f126b72d-public-tls-certs\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw" Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.268753 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1949ec1-5153-4003-b960-68a8f126b72d-log-httpd\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw" Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.268780 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1949ec1-5153-4003-b960-68a8f126b72d-combined-ca-bundle\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw" Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.268818 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b1949ec1-5153-4003-b960-68a8f126b72d-etc-swift\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw" Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.268840 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1949ec1-5153-4003-b960-68a8f126b72d-config-data\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw" 
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.370416 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2vzf\" (UniqueName: \"kubernetes.io/projected/b1949ec1-5153-4003-b960-68a8f126b72d-kube-api-access-j2vzf\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.370480 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1949ec1-5153-4003-b960-68a8f126b72d-run-httpd\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.370514 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1949ec1-5153-4003-b960-68a8f126b72d-internal-tls-certs\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.370542 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1949ec1-5153-4003-b960-68a8f126b72d-public-tls-certs\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.370571 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1949ec1-5153-4003-b960-68a8f126b72d-log-httpd\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.371348 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1949ec1-5153-4003-b960-68a8f126b72d-run-httpd\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.371519 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1949ec1-5153-4003-b960-68a8f126b72d-log-httpd\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.371717 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1949ec1-5153-4003-b960-68a8f126b72d-combined-ca-bundle\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.371769 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b1949ec1-5153-4003-b960-68a8f126b72d-etc-swift\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.371810 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1949ec1-5153-4003-b960-68a8f126b72d-config-data\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.378151 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b1949ec1-5153-4003-b960-68a8f126b72d-etc-swift\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.378285 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1949ec1-5153-4003-b960-68a8f126b72d-config-data\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.385476 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1949ec1-5153-4003-b960-68a8f126b72d-combined-ca-bundle\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.386164 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1949ec1-5153-4003-b960-68a8f126b72d-public-tls-certs\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.386702 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1949ec1-5153-4003-b960-68a8f126b72d-internal-tls-certs\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.387601 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2vzf\" (UniqueName: \"kubernetes.io/projected/b1949ec1-5153-4003-b960-68a8f126b72d-kube-api-access-j2vzf\") pod \"swift-proxy-5c4dc8df6c-b88lw\" (UID: \"b1949ec1-5153-4003-b960-68a8f126b72d\") " pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:51 crc kubenswrapper[4717]: I0221 22:03:51.492025 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:03:52 crc kubenswrapper[4717]: I0221 22:03:52.044689 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c4dc8df6c-b88lw"]
Feb 21 22:03:52 crc kubenswrapper[4717]: W0221 22:03:52.048689 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1949ec1_5153_4003_b960_68a8f126b72d.slice/crio-61653239e989e4ed89f7b20e3ee0b50c253b54fd3939e0f803e04b1ff0d06fa9 WatchSource:0}: Error finding container 61653239e989e4ed89f7b20e3ee0b50c253b54fd3939e0f803e04b1ff0d06fa9: Status 404 returned error can't find the container with id 61653239e989e4ed89f7b20e3ee0b50c253b54fd3939e0f803e04b1ff0d06fa9
Feb 21 22:03:52 crc kubenswrapper[4717]: I0221 22:03:52.551334 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c4dc8df6c-b88lw" event={"ID":"b1949ec1-5153-4003-b960-68a8f126b72d","Type":"ContainerStarted","Data":"61653239e989e4ed89f7b20e3ee0b50c253b54fd3939e0f803e04b1ff0d06fa9"}
Feb 21 22:03:52 crc kubenswrapper[4717]: I0221 22:03:52.975835 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 21 22:03:52 crc kubenswrapper[4717]: I0221 22:03:52.976275 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="ceilometer-central-agent" containerID="cri-o://ba18616baa2d306c15338c2a8a7f166810b5e29106d040f5a51056a9756e12f5" gracePeriod=30
Feb 21 22:03:52 crc kubenswrapper[4717]: I0221 22:03:52.977622 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="proxy-httpd" containerID="cri-o://a1eeefa03a4ffad553ca02b4401d4f4714edf9bb2001de411c5257f93d89eeb1" gracePeriod=30
Feb 21 22:03:52 crc kubenswrapper[4717]: I0221 22:03:52.977725 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="sg-core" containerID="cri-o://984d0d934dc14e4af01410f368a4e033baff48094ef3b6a56c4571b86abc662e" gracePeriod=30
Feb 21 22:03:52 crc kubenswrapper[4717]: I0221 22:03:52.977794 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="ceilometer-notification-agent" containerID="cri-o://91f2afbd1f8d214e6c8cdb70cbbde386147649062e753feb93f24ca57347cfae" gracePeriod=30
Feb 21 22:03:52 crc kubenswrapper[4717]: I0221 22:03:52.981811 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 21 22:03:53 crc kubenswrapper[4717]: I0221 22:03:53.564754 4717 generic.go:334] "Generic (PLEG): container finished" podID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerID="a1eeefa03a4ffad553ca02b4401d4f4714edf9bb2001de411c5257f93d89eeb1" exitCode=0
Feb 21 22:03:53 crc kubenswrapper[4717]: I0221 22:03:53.564793 4717 generic.go:334] "Generic (PLEG): container finished" podID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerID="984d0d934dc14e4af01410f368a4e033baff48094ef3b6a56c4571b86abc662e" exitCode=2
Feb 21 22:03:53 crc kubenswrapper[4717]: I0221 22:03:53.564804 4717 generic.go:334] "Generic (PLEG): container finished" podID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerID="ba18616baa2d306c15338c2a8a7f166810b5e29106d040f5a51056a9756e12f5" exitCode=0
Feb 21 22:03:53 crc kubenswrapper[4717]: I0221 22:03:53.564826 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df","Type":"ContainerDied","Data":"a1eeefa03a4ffad553ca02b4401d4f4714edf9bb2001de411c5257f93d89eeb1"}
Feb 21 22:03:53 crc kubenswrapper[4717]: I0221 22:03:53.564875 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df","Type":"ContainerDied","Data":"984d0d934dc14e4af01410f368a4e033baff48094ef3b6a56c4571b86abc662e"}
Feb 21 22:03:53 crc kubenswrapper[4717]: I0221 22:03:53.564892 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df","Type":"ContainerDied","Data":"ba18616baa2d306c15338c2a8a7f166810b5e29106d040f5a51056a9756e12f5"}
Feb 21 22:03:55 crc kubenswrapper[4717]: I0221 22:03:55.586136 4717 generic.go:334] "Generic (PLEG): container finished" podID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerID="91f2afbd1f8d214e6c8cdb70cbbde386147649062e753feb93f24ca57347cfae" exitCode=0
Feb 21 22:03:55 crc kubenswrapper[4717]: I0221 22:03:55.586334 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df","Type":"ContainerDied","Data":"91f2afbd1f8d214e6c8cdb70cbbde386147649062e753feb93f24ca57347cfae"}
Feb 21 22:03:58 crc kubenswrapper[4717]: I0221 22:03:58.348509 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 21 22:03:58 crc kubenswrapper[4717]: I0221 22:03:58.349017 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="2e08bfd4-7ef2-4895-89e2-c9265d0adc13" containerName="kube-state-metrics" containerID="cri-o://69448ec46413811c070cdb37b272c6eeaec778e1fd6a5eca434201575ec96f11" gracePeriod=30
Feb 21 22:03:58 crc kubenswrapper[4717]: I0221 22:03:58.510737 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="2e08bfd4-7ef2-4895-89e2-c9265d0adc13" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.100:8081/readyz\": dial tcp 10.217.0.100:8081: connect: connection refused"
Feb 21 22:03:58 crc kubenswrapper[4717]: I0221 22:03:58.621442 4717 generic.go:334] "Generic (PLEG): container finished" podID="2e08bfd4-7ef2-4895-89e2-c9265d0adc13" containerID="69448ec46413811c070cdb37b272c6eeaec778e1fd6a5eca434201575ec96f11" exitCode=2
Feb 21 22:03:58 crc kubenswrapper[4717]: I0221 22:03:58.621525 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2e08bfd4-7ef2-4895-89e2-c9265d0adc13","Type":"ContainerDied","Data":"69448ec46413811c070cdb37b272c6eeaec778e1fd6a5eca434201575ec96f11"}
Feb 21 22:03:58 crc kubenswrapper[4717]: I0221 22:03:58.848600 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-669df94976-tmfpb" podUID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused"
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.107597 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.212521 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-gvjrw"]
Feb 21 22:03:59 crc kubenswrapper[4717]: E0221 22:03:59.212990 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="sg-core"
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.213008 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="sg-core"
Feb 21 22:03:59 crc kubenswrapper[4717]: E0221 22:03:59.213029 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="ceilometer-central-agent"
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.213038 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="ceilometer-central-agent"
Feb 21 22:03:59 crc kubenswrapper[4717]: E0221 22:03:59.213049 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="ceilometer-notification-agent"
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.213056 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="ceilometer-notification-agent"
Feb 21 22:03:59 crc kubenswrapper[4717]: E0221 22:03:59.213123 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="proxy-httpd"
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.213132 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="proxy-httpd"
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.213331 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="ceilometer-central-agent"
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.213355 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="sg-core"
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.214244 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="ceilometer-notification-agent"
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.214290 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" containerName="proxy-httpd"
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.215187 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-gvjrw"
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.217255 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-sg-core-conf-yaml\") pod \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") "
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.217296 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwnwt\" (UniqueName: \"kubernetes.io/projected/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-kube-api-access-xwnwt\") pod \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") "
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.217365 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-log-httpd\") pod \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") "
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.217412 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-scripts\") pod \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") "
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.217515 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-run-httpd\") pod \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") "
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.217569 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-combined-ca-bundle\") pod \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") "
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.217624 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-config-data\") pod \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\" (UID: \"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df\") "
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.218563 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" (UID: "df6bc00e-61fa-43eb-960b-2a1fa5d1d6df"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.219014 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" (UID: "df6bc00e-61fa-43eb-960b-2a1fa5d1d6df"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.222992 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-scripts" (OuterVolumeSpecName: "scripts") pod "df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" (UID: "df6bc00e-61fa-43eb-960b-2a1fa5d1d6df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.225953 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-kube-api-access-xwnwt" (OuterVolumeSpecName: "kube-api-access-xwnwt") pod "df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" (UID: "df6bc00e-61fa-43eb-960b-2a1fa5d1d6df"). InnerVolumeSpecName "kube-api-access-xwnwt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.226009 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.238838 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-gvjrw"] Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.298171 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" (UID: "df6bc00e-61fa-43eb-960b-2a1fa5d1d6df"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.321313 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbss2\" (UniqueName: \"kubernetes.io/projected/2e08bfd4-7ef2-4895-89e2-c9265d0adc13-kube-api-access-gbss2\") pod \"2e08bfd4-7ef2-4895-89e2-c9265d0adc13\" (UID: \"2e08bfd4-7ef2-4895-89e2-c9265d0adc13\") " Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.321698 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86vkt\" (UniqueName: \"kubernetes.io/projected/4dfd7221-c036-41a9-be83-4b0d9d1fac4b-kube-api-access-86vkt\") pod \"nova-api-db-create-gvjrw\" (UID: \"4dfd7221-c036-41a9-be83-4b0d9d1fac4b\") " pod="openstack/nova-api-db-create-gvjrw" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.321767 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dfd7221-c036-41a9-be83-4b0d9d1fac4b-operator-scripts\") pod \"nova-api-db-create-gvjrw\" (UID: \"4dfd7221-c036-41a9-be83-4b0d9d1fac4b\") " pod="openstack/nova-api-db-create-gvjrw" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.322248 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.322277 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.322306 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwnwt\" (UniqueName: \"kubernetes.io/projected/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-kube-api-access-xwnwt\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.322315 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.322324 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.325019 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e08bfd4-7ef2-4895-89e2-c9265d0adc13-kube-api-access-gbss2" (OuterVolumeSpecName: "kube-api-access-gbss2") pod "2e08bfd4-7ef2-4895-89e2-c9265d0adc13" (UID: "2e08bfd4-7ef2-4895-89e2-c9265d0adc13"). InnerVolumeSpecName "kube-api-access-gbss2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.328418 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-vvzkx"] Feb 21 22:03:59 crc kubenswrapper[4717]: E0221 22:03:59.329001 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e08bfd4-7ef2-4895-89e2-c9265d0adc13" containerName="kube-state-metrics" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.329083 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e08bfd4-7ef2-4895-89e2-c9265d0adc13" containerName="kube-state-metrics" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.329366 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e08bfd4-7ef2-4895-89e2-c9265d0adc13" containerName="kube-state-metrics" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.330133 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vvzkx" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.337750 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-dbfd-account-create-update-lhmrn"] Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.339121 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-dbfd-account-create-update-lhmrn" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.344883 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.349299 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vvzkx"] Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.373511 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-dbfd-account-create-update-lhmrn"] Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.424818 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj89x\" (UniqueName: \"kubernetes.io/projected/d2afd36f-b3bb-4232-8b66-2cc64cf53f6a-kube-api-access-lj89x\") pod \"nova-api-dbfd-account-create-update-lhmrn\" (UID: \"d2afd36f-b3bb-4232-8b66-2cc64cf53f6a\") " pod="openstack/nova-api-dbfd-account-create-update-lhmrn" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.425566 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" (UID: "df6bc00e-61fa-43eb-960b-2a1fa5d1d6df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.433211 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86vkt\" (UniqueName: \"kubernetes.io/projected/4dfd7221-c036-41a9-be83-4b0d9d1fac4b-kube-api-access-86vkt\") pod \"nova-api-db-create-gvjrw\" (UID: \"4dfd7221-c036-41a9-be83-4b0d9d1fac4b\") " pod="openstack/nova-api-db-create-gvjrw" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.433328 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dfd7221-c036-41a9-be83-4b0d9d1fac4b-operator-scripts\") pod \"nova-api-db-create-gvjrw\" (UID: \"4dfd7221-c036-41a9-be83-4b0d9d1fac4b\") " pod="openstack/nova-api-db-create-gvjrw" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.435288 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d34ce70-aef8-44bc-872a-be96892f145f-operator-scripts\") pod \"nova-cell0-db-create-vvzkx\" (UID: \"7d34ce70-aef8-44bc-872a-be96892f145f\") " pod="openstack/nova-cell0-db-create-vvzkx" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.435383 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2afd36f-b3bb-4232-8b66-2cc64cf53f6a-operator-scripts\") pod \"nova-api-dbfd-account-create-update-lhmrn\" (UID: \"d2afd36f-b3bb-4232-8b66-2cc64cf53f6a\") " pod="openstack/nova-api-dbfd-account-create-update-lhmrn" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.435639 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjd97\" (UniqueName: \"kubernetes.io/projected/7d34ce70-aef8-44bc-872a-be96892f145f-kube-api-access-jjd97\") pod 
\"nova-cell0-db-create-vvzkx\" (UID: \"7d34ce70-aef8-44bc-872a-be96892f145f\") " pod="openstack/nova-cell0-db-create-vvzkx" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.435817 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.435830 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbss2\" (UniqueName: \"kubernetes.io/projected/2e08bfd4-7ef2-4895-89e2-c9265d0adc13-kube-api-access-gbss2\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.438147 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dfd7221-c036-41a9-be83-4b0d9d1fac4b-operator-scripts\") pod \"nova-api-db-create-gvjrw\" (UID: \"4dfd7221-c036-41a9-be83-4b0d9d1fac4b\") " pod="openstack/nova-api-db-create-gvjrw" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.441980 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-config-data" (OuterVolumeSpecName: "config-data") pod "df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" (UID: "df6bc00e-61fa-43eb-960b-2a1fa5d1d6df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.446735 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wl5ms"] Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.447970 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wl5ms" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.461985 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86vkt\" (UniqueName: \"kubernetes.io/projected/4dfd7221-c036-41a9-be83-4b0d9d1fac4b-kube-api-access-86vkt\") pod \"nova-api-db-create-gvjrw\" (UID: \"4dfd7221-c036-41a9-be83-4b0d9d1fac4b\") " pod="openstack/nova-api-db-create-gvjrw" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.485585 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wl5ms"] Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.536818 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b797-account-create-update-mfrb6"] Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.537563 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjd97\" (UniqueName: \"kubernetes.io/projected/7d34ce70-aef8-44bc-872a-be96892f145f-kube-api-access-jjd97\") pod \"nova-cell0-db-create-vvzkx\" (UID: \"7d34ce70-aef8-44bc-872a-be96892f145f\") " pod="openstack/nova-cell0-db-create-vvzkx" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.537662 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj89x\" (UniqueName: \"kubernetes.io/projected/d2afd36f-b3bb-4232-8b66-2cc64cf53f6a-kube-api-access-lj89x\") pod \"nova-api-dbfd-account-create-update-lhmrn\" (UID: \"d2afd36f-b3bb-4232-8b66-2cc64cf53f6a\") " pod="openstack/nova-api-dbfd-account-create-update-lhmrn" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.537717 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ecbe558-f60c-4be1-8dbb-35110ba64185-operator-scripts\") pod \"nova-cell1-db-create-wl5ms\" (UID: \"8ecbe558-f60c-4be1-8dbb-35110ba64185\") " 
pod="openstack/nova-cell1-db-create-wl5ms" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.537748 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d34ce70-aef8-44bc-872a-be96892f145f-operator-scripts\") pod \"nova-cell0-db-create-vvzkx\" (UID: \"7d34ce70-aef8-44bc-872a-be96892f145f\") " pod="openstack/nova-cell0-db-create-vvzkx" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.537784 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2afd36f-b3bb-4232-8b66-2cc64cf53f6a-operator-scripts\") pod \"nova-api-dbfd-account-create-update-lhmrn\" (UID: \"d2afd36f-b3bb-4232-8b66-2cc64cf53f6a\") " pod="openstack/nova-api-dbfd-account-create-update-lhmrn" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.537830 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c75nd\" (UniqueName: \"kubernetes.io/projected/8ecbe558-f60c-4be1-8dbb-35110ba64185-kube-api-access-c75nd\") pod \"nova-cell1-db-create-wl5ms\" (UID: \"8ecbe558-f60c-4be1-8dbb-35110ba64185\") " pod="openstack/nova-cell1-db-create-wl5ms" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.537903 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.538048 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b797-account-create-update-mfrb6" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.538696 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d34ce70-aef8-44bc-872a-be96892f145f-operator-scripts\") pod \"nova-cell0-db-create-vvzkx\" (UID: \"7d34ce70-aef8-44bc-872a-be96892f145f\") " pod="openstack/nova-cell0-db-create-vvzkx" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.538750 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2afd36f-b3bb-4232-8b66-2cc64cf53f6a-operator-scripts\") pod \"nova-api-dbfd-account-create-update-lhmrn\" (UID: \"d2afd36f-b3bb-4232-8b66-2cc64cf53f6a\") " pod="openstack/nova-api-dbfd-account-create-update-lhmrn" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.539902 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.553605 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-gvjrw" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.555519 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj89x\" (UniqueName: \"kubernetes.io/projected/d2afd36f-b3bb-4232-8b66-2cc64cf53f6a-kube-api-access-lj89x\") pod \"nova-api-dbfd-account-create-update-lhmrn\" (UID: \"d2afd36f-b3bb-4232-8b66-2cc64cf53f6a\") " pod="openstack/nova-api-dbfd-account-create-update-lhmrn" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.559826 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjd97\" (UniqueName: \"kubernetes.io/projected/7d34ce70-aef8-44bc-872a-be96892f145f-kube-api-access-jjd97\") pod \"nova-cell0-db-create-vvzkx\" (UID: \"7d34ce70-aef8-44bc-872a-be96892f145f\") " pod="openstack/nova-cell0-db-create-vvzkx" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.562068 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b797-account-create-update-mfrb6"] Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.587594 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-677cdf8c9f-j2vl7" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.610813 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1f48-account-create-update-wjdg5"] Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.611965 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1f48-account-create-update-wjdg5" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.616171 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.633531 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1f48-account-create-update-wjdg5"] Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.659556 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ecbe558-f60c-4be1-8dbb-35110ba64185-operator-scripts\") pod \"nova-cell1-db-create-wl5ms\" (UID: \"8ecbe558-f60c-4be1-8dbb-35110ba64185\") " pod="openstack/nova-cell1-db-create-wl5ms" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.659659 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d430f5c-b58e-4b87-9ff4-391c6d796215-operator-scripts\") pod \"nova-cell0-b797-account-create-update-mfrb6\" (UID: \"9d430f5c-b58e-4b87-9ff4-391c6d796215\") " pod="openstack/nova-cell0-b797-account-create-update-mfrb6" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.659718 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c75nd\" (UniqueName: \"kubernetes.io/projected/8ecbe558-f60c-4be1-8dbb-35110ba64185-kube-api-access-c75nd\") pod \"nova-cell1-db-create-wl5ms\" (UID: \"8ecbe558-f60c-4be1-8dbb-35110ba64185\") " pod="openstack/nova-cell1-db-create-wl5ms" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.659755 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdftn\" (UniqueName: \"kubernetes.io/projected/9d430f5c-b58e-4b87-9ff4-391c6d796215-kube-api-access-vdftn\") pod 
\"nova-cell0-b797-account-create-update-mfrb6\" (UID: \"9d430f5c-b58e-4b87-9ff4-391c6d796215\") " pod="openstack/nova-cell0-b797-account-create-update-mfrb6" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.671606 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ecbe558-f60c-4be1-8dbb-35110ba64185-operator-scripts\") pod \"nova-cell1-db-create-wl5ms\" (UID: \"8ecbe558-f60c-4be1-8dbb-35110ba64185\") " pod="openstack/nova-cell1-db-create-wl5ms" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.687746 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7cc86c585-7kk6w_acdc8b18-646d-4f3d-8c30-9e80d7b78058/neutron-api/1.log" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.689027 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8fbcb64b8-88w42"] Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.690335 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-vvzkx" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.690933 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8fbcb64b8-88w42" podUID="7dccedac-c29e-4ae2-bfac-d55b444cb715" containerName="neutron-httpd" containerID="cri-o://42c29a25ab668a8d51496582563028daa1de047fe727efa08fd3504144a43836" gracePeriod=30 Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.691342 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8fbcb64b8-88w42" podUID="7dccedac-c29e-4ae2-bfac-d55b444cb715" containerName="neutron-api" containerID="cri-o://cd9cdcb54d005627316cfa56c267d1c64c79db62fb498db101e2f032a6ec6998" gracePeriod=30 Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.700480 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c75nd\" (UniqueName: \"kubernetes.io/projected/8ecbe558-f60c-4be1-8dbb-35110ba64185-kube-api-access-c75nd\") pod \"nova-cell1-db-create-wl5ms\" (UID: \"8ecbe558-f60c-4be1-8dbb-35110ba64185\") " pod="openstack/nova-cell1-db-create-wl5ms" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.702478 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7cc86c585-7kk6w_acdc8b18-646d-4f3d-8c30-9e80d7b78058/neutron-api/0.log" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.702600 4717 generic.go:334] "Generic (PLEG): container finished" podID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerID="36102092b9c64661dccc8a40947d3bd38f49cb5b7cee6da295a6d3cf1e75fc7b" exitCode=137 Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.702758 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cc86c585-7kk6w" event={"ID":"acdc8b18-646d-4f3d-8c30-9e80d7b78058","Type":"ContainerDied","Data":"36102092b9c64661dccc8a40947d3bd38f49cb5b7cee6da295a6d3cf1e75fc7b"} Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 
22:03:59.702996 4717 scope.go:117] "RemoveContainer" containerID="0be6efccbce84f3ce85b6927315fc3ea7c9d7fb5689f128e98df6fe068617cb7" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.703536 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-dbfd-account-create-update-lhmrn" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.726042 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c4dc8df6c-b88lw" event={"ID":"b1949ec1-5153-4003-b960-68a8f126b72d","Type":"ContainerStarted","Data":"e47a72e8a004af3425045b7db5271c4c4432ddf961e0fd67758850bd9085a25e"} Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.726105 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c4dc8df6c-b88lw" event={"ID":"b1949ec1-5153-4003-b960-68a8f126b72d","Type":"ContainerStarted","Data":"ba4d944e742ce6b425ec820fdd18b28f88f76adaa691008c266907bffb3847ea"} Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.726968 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c4dc8df6c-b88lw" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.727008 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c4dc8df6c-b88lw" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.748158 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"756781b7-938f-4654-8109-725420287d7b","Type":"ContainerStarted","Data":"305df42d6ecdca4e7fa65c3fe09ba8ac73b004a33611c812e23ea940b056f229"} Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.762317 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prsg8\" (UniqueName: \"kubernetes.io/projected/9186937b-6e90-45ec-9494-aef66cdfe28b-kube-api-access-prsg8\") pod \"nova-cell1-1f48-account-create-update-wjdg5\" (UID: 
\"9186937b-6e90-45ec-9494-aef66cdfe28b\") " pod="openstack/nova-cell1-1f48-account-create-update-wjdg5" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.762412 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d430f5c-b58e-4b87-9ff4-391c6d796215-operator-scripts\") pod \"nova-cell0-b797-account-create-update-mfrb6\" (UID: \"9d430f5c-b58e-4b87-9ff4-391c6d796215\") " pod="openstack/nova-cell0-b797-account-create-update-mfrb6" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.762466 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdftn\" (UniqueName: \"kubernetes.io/projected/9d430f5c-b58e-4b87-9ff4-391c6d796215-kube-api-access-vdftn\") pod \"nova-cell0-b797-account-create-update-mfrb6\" (UID: \"9d430f5c-b58e-4b87-9ff4-391c6d796215\") " pod="openstack/nova-cell0-b797-account-create-update-mfrb6" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.762519 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9186937b-6e90-45ec-9494-aef66cdfe28b-operator-scripts\") pod \"nova-cell1-1f48-account-create-update-wjdg5\" (UID: \"9186937b-6e90-45ec-9494-aef66cdfe28b\") " pod="openstack/nova-cell1-1f48-account-create-update-wjdg5" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.764801 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5c4dc8df6c-b88lw" podStartSLOduration=8.76114344 podStartE2EDuration="8.76114344s" podCreationTimestamp="2026-02-21 22:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:03:59.755130756 +0000 UTC m=+1054.536664378" watchObservedRunningTime="2026-02-21 22:03:59.76114344 +0000 UTC m=+1054.542677062" Feb 21 22:03:59 crc 
kubenswrapper[4717]: I0221 22:03:59.764935 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d430f5c-b58e-4b87-9ff4-391c6d796215-operator-scripts\") pod \"nova-cell0-b797-account-create-update-mfrb6\" (UID: \"9d430f5c-b58e-4b87-9ff4-391c6d796215\") " pod="openstack/nova-cell0-b797-account-create-update-mfrb6" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.774543 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df6bc00e-61fa-43eb-960b-2a1fa5d1d6df","Type":"ContainerDied","Data":"de0cb7e39543de7006a2ae836bb051cd38937afd2157c2960fc87f196e2d795c"} Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.774970 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.794029 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.494237371 podStartE2EDuration="12.794007161s" podCreationTimestamp="2026-02-21 22:03:47 +0000 UTC" firstStartedPulling="2026-02-21 22:03:48.453187594 +0000 UTC m=+1043.234721216" lastFinishedPulling="2026-02-21 22:03:58.752957384 +0000 UTC m=+1053.534491006" observedRunningTime="2026-02-21 22:03:59.780540867 +0000 UTC m=+1054.562074489" watchObservedRunningTime="2026-02-21 22:03:59.794007161 +0000 UTC m=+1054.575540793" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.806887 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wl5ms" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.807742 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdftn\" (UniqueName: \"kubernetes.io/projected/9d430f5c-b58e-4b87-9ff4-391c6d796215-kube-api-access-vdftn\") pod \"nova-cell0-b797-account-create-update-mfrb6\" (UID: \"9d430f5c-b58e-4b87-9ff4-391c6d796215\") " pod="openstack/nova-cell0-b797-account-create-update-mfrb6" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.816838 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2e08bfd4-7ef2-4895-89e2-c9265d0adc13","Type":"ContainerDied","Data":"e8757ffb32c00b8d106e292ccb847bf0bb61e5ec5d11f91816df850941bdd62a"} Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.816963 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.856969 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b797-account-create-update-mfrb6" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.864228 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9186937b-6e90-45ec-9494-aef66cdfe28b-operator-scripts\") pod \"nova-cell1-1f48-account-create-update-wjdg5\" (UID: \"9186937b-6e90-45ec-9494-aef66cdfe28b\") " pod="openstack/nova-cell1-1f48-account-create-update-wjdg5" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.864345 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prsg8\" (UniqueName: \"kubernetes.io/projected/9186937b-6e90-45ec-9494-aef66cdfe28b-kube-api-access-prsg8\") pod \"nova-cell1-1f48-account-create-update-wjdg5\" (UID: \"9186937b-6e90-45ec-9494-aef66cdfe28b\") " pod="openstack/nova-cell1-1f48-account-create-update-wjdg5" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.865345 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9186937b-6e90-45ec-9494-aef66cdfe28b-operator-scripts\") pod \"nova-cell1-1f48-account-create-update-wjdg5\" (UID: \"9186937b-6e90-45ec-9494-aef66cdfe28b\") " pod="openstack/nova-cell1-1f48-account-create-update-wjdg5" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.884750 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prsg8\" (UniqueName: \"kubernetes.io/projected/9186937b-6e90-45ec-9494-aef66cdfe28b-kube-api-access-prsg8\") pod \"nova-cell1-1f48-account-create-update-wjdg5\" (UID: \"9186937b-6e90-45ec-9494-aef66cdfe28b\") " pod="openstack/nova-cell1-1f48-account-create-update-wjdg5" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.948623 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.955023 4717 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1f48-account-create-update-wjdg5" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.957581 4717 scope.go:117] "RemoveContainer" containerID="a1eeefa03a4ffad553ca02b4401d4f4714edf9bb2001de411c5257f93d89eeb1" Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.960268 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:03:59 crc kubenswrapper[4717]: I0221 22:03:59.973438 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.046329 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df6bc00e-61fa-43eb-960b-2a1fa5d1d6df" path="/var/lib/kubelet/pods/df6bc00e-61fa-43eb-960b-2a1fa5d1d6df/volumes" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.047320 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.047364 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.051000 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.051743 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.051756 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.051823 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.052587 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.055426 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.056292 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-g2dk7" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.056428 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.063908 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.064308 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.164922 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7cc86c585-7kk6w_acdc8b18-646d-4f3d-8c30-9e80d7b78058/neutron-api/1.log" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.167300 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.179144 4717 scope.go:117] "RemoveContainer" containerID="984d0d934dc14e4af01410f368a4e033baff48094ef3b6a56c4571b86abc662e" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.180104 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87baa801-afa3-4f80-abf4-cbffcd2da28e-run-httpd\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.180151 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b\") " pod="openstack/kube-state-metrics-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.180175 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gdgt\" (UniqueName: \"kubernetes.io/projected/3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b-kube-api-access-4gdgt\") pod \"kube-state-metrics-0\" (UID: \"3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b\") " pod="openstack/kube-state-metrics-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.180202 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.180281 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/87baa801-afa3-4f80-abf4-cbffcd2da28e-log-httpd\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.180317 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.180358 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b\") " pod="openstack/kube-state-metrics-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.180392 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-config-data\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.182047 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-scripts\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.182077 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xswjh\" (UniqueName: \"kubernetes.io/projected/87baa801-afa3-4f80-abf4-cbffcd2da28e-kube-api-access-xswjh\") pod 
\"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.182112 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b\") " pod="openstack/kube-state-metrics-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.259311 4717 scope.go:117] "RemoveContainer" containerID="91f2afbd1f8d214e6c8cdb70cbbde386147649062e753feb93f24ca57347cfae" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.282920 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-httpd-config\") pod \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.282998 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-public-tls-certs\") pod \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283092 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-config\") pod \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283136 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-ovndb-tls-certs\") pod 
\"acdc8b18-646d-4f3d-8c30-9e80d7b78058\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283161 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-internal-tls-certs\") pod \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283194 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59sbb\" (UniqueName: \"kubernetes.io/projected/acdc8b18-646d-4f3d-8c30-9e80d7b78058-kube-api-access-59sbb\") pod \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283236 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-combined-ca-bundle\") pod \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\" (UID: \"acdc8b18-646d-4f3d-8c30-9e80d7b78058\") " Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283468 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87baa801-afa3-4f80-abf4-cbffcd2da28e-run-httpd\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283491 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b\") " pod="openstack/kube-state-metrics-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283508 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4gdgt\" (UniqueName: \"kubernetes.io/projected/3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b-kube-api-access-4gdgt\") pod \"kube-state-metrics-0\" (UID: \"3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b\") " pod="openstack/kube-state-metrics-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283530 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283575 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87baa801-afa3-4f80-abf4-cbffcd2da28e-log-httpd\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283611 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283649 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b\") " pod="openstack/kube-state-metrics-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283682 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-config-data\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283700 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-scripts\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283724 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xswjh\" (UniqueName: \"kubernetes.io/projected/87baa801-afa3-4f80-abf4-cbffcd2da28e-kube-api-access-xswjh\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.283756 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b\") " pod="openstack/kube-state-metrics-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.294042 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b\") " pod="openstack/kube-state-metrics-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.294300 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-scripts\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " 
pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.298300 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87baa801-afa3-4f80-abf4-cbffcd2da28e-run-httpd\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.298381 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87baa801-afa3-4f80-abf4-cbffcd2da28e-log-httpd\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.303273 4717 scope.go:117] "RemoveContainer" containerID="ba18616baa2d306c15338c2a8a7f166810b5e29106d040f5a51056a9756e12f5" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.304773 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-gvjrw"] Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.324350 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b\") " pod="openstack/kube-state-metrics-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.324781 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b\") " pod="openstack/kube-state-metrics-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.321976 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "acdc8b18-646d-4f3d-8c30-9e80d7b78058" (UID: "acdc8b18-646d-4f3d-8c30-9e80d7b78058"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.326196 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-config-data\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.326287 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.343100 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xswjh\" (UniqueName: \"kubernetes.io/projected/87baa801-afa3-4f80-abf4-cbffcd2da28e-kube-api-access-xswjh\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.345509 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") " pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.346062 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acdc8b18-646d-4f3d-8c30-9e80d7b78058-kube-api-access-59sbb" (OuterVolumeSpecName: "kube-api-access-59sbb") pod 
"acdc8b18-646d-4f3d-8c30-9e80d7b78058" (UID: "acdc8b18-646d-4f3d-8c30-9e80d7b78058"). InnerVolumeSpecName "kube-api-access-59sbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.350656 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gdgt\" (UniqueName: \"kubernetes.io/projected/3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b-kube-api-access-4gdgt\") pod \"kube-state-metrics-0\" (UID: \"3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b\") " pod="openstack/kube-state-metrics-0" Feb 21 22:04:00 crc kubenswrapper[4717]: W0221 22:04:00.367901 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dfd7221_c036_41a9_be83_4b0d9d1fac4b.slice/crio-c352a93ad42340a963769f55a8745a2c74c8ec315362690e0adb8b7066e0b68c WatchSource:0}: Error finding container c352a93ad42340a963769f55a8745a2c74c8ec315362690e0adb8b7066e0b68c: Status 404 returned error can't find the container with id c352a93ad42340a963769f55a8745a2c74c8ec315362690e0adb8b7066e0b68c Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.375055 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-config" (OuterVolumeSpecName: "config") pod "acdc8b18-646d-4f3d-8c30-9e80d7b78058" (UID: "acdc8b18-646d-4f3d-8c30-9e80d7b78058"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.375518 4717 scope.go:117] "RemoveContainer" containerID="69448ec46413811c070cdb37b272c6eeaec778e1fd6a5eca434201575ec96f11" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.385722 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.385755 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.385768 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59sbb\" (UniqueName: \"kubernetes.io/projected/acdc8b18-646d-4f3d-8c30-9e80d7b78058-kube-api-access-59sbb\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.393443 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.396768 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acdc8b18-646d-4f3d-8c30-9e80d7b78058" (UID: "acdc8b18-646d-4f3d-8c30-9e80d7b78058"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.414080 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.428695 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "acdc8b18-646d-4f3d-8c30-9e80d7b78058" (UID: "acdc8b18-646d-4f3d-8c30-9e80d7b78058"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.451223 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vvzkx"] Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.451274 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "acdc8b18-646d-4f3d-8c30-9e80d7b78058" (UID: "acdc8b18-646d-4f3d-8c30-9e80d7b78058"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.452351 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "acdc8b18-646d-4f3d-8c30-9e80d7b78058" (UID: "acdc8b18-646d-4f3d-8c30-9e80d7b78058"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.487621 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.487657 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.487668 4717 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.487678 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acdc8b18-646d-4f3d-8c30-9e80d7b78058-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.593463 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-dbfd-account-create-update-lhmrn"] Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.603972 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wl5ms"] Feb 21 22:04:00 crc kubenswrapper[4717]: W0221 22:04:00.620655 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2afd36f_b3bb_4232_8b66_2cc64cf53f6a.slice/crio-cb31d7e38d7d4e6cb6788ccd5c2f456bd8999b611377481fb37f6cdabc761346 WatchSource:0}: Error finding container cb31d7e38d7d4e6cb6788ccd5c2f456bd8999b611377481fb37f6cdabc761346: Status 404 returned error can't find the container with id 
cb31d7e38d7d4e6cb6788ccd5c2f456bd8999b611377481fb37f6cdabc761346 Feb 21 22:04:00 crc kubenswrapper[4717]: W0221 22:04:00.642433 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ecbe558_f60c_4be1_8dbb_35110ba64185.slice/crio-457811339de51543832667779551767f7a7b275f1c3602ab34d5b47348214166 WatchSource:0}: Error finding container 457811339de51543832667779551767f7a7b275f1c3602ab34d5b47348214166: Status 404 returned error can't find the container with id 457811339de51543832667779551767f7a7b275f1c3602ab34d5b47348214166 Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.814086 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b797-account-create-update-mfrb6"] Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.844304 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wl5ms" event={"ID":"8ecbe558-f60c-4be1-8dbb-35110ba64185","Type":"ContainerStarted","Data":"457811339de51543832667779551767f7a7b275f1c3602ab34d5b47348214166"} Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.847820 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gvjrw" event={"ID":"4dfd7221-c036-41a9-be83-4b0d9d1fac4b","Type":"ContainerStarted","Data":"fcd44a1187b50a99d3c545854bd8b3f477554d831ffa40d1532d971cb1dc03c8"} Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.847851 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gvjrw" event={"ID":"4dfd7221-c036-41a9-be83-4b0d9d1fac4b","Type":"ContainerStarted","Data":"c352a93ad42340a963769f55a8745a2c74c8ec315362690e0adb8b7066e0b68c"} Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.852357 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1f48-account-create-update-wjdg5"] Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.853710 4717 generic.go:334] "Generic (PLEG): 
container finished" podID="7dccedac-c29e-4ae2-bfac-d55b444cb715" containerID="42c29a25ab668a8d51496582563028daa1de047fe727efa08fd3504144a43836" exitCode=0 Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.853769 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8fbcb64b8-88w42" event={"ID":"7dccedac-c29e-4ae2-bfac-d55b444cb715","Type":"ContainerDied","Data":"42c29a25ab668a8d51496582563028daa1de047fe727efa08fd3504144a43836"} Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.855090 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7cc86c585-7kk6w_acdc8b18-646d-4f3d-8c30-9e80d7b78058/neutron-api/1.log" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.926084 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7cc86c585-7kk6w" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.926566 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cc86c585-7kk6w" event={"ID":"acdc8b18-646d-4f3d-8c30-9e80d7b78058","Type":"ContainerDied","Data":"1ab1aa5ab303eff671d3ed9a9f6be1191a457c3cc5eb7df429dab3171096ada4"} Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.926614 4717 scope.go:117] "RemoveContainer" containerID="36102092b9c64661dccc8a40947d3bd38f49cb5b7cee6da295a6d3cf1e75fc7b" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.936901 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-gvjrw" podStartSLOduration=1.9368798969999999 podStartE2EDuration="1.936879897s" podCreationTimestamp="2026-02-21 22:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:04:00.926893137 +0000 UTC m=+1055.708426759" watchObservedRunningTime="2026-02-21 22:04:00.936879897 +0000 UTC m=+1055.718413519" Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.945701 4717 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 21 22:04:00 crc kubenswrapper[4717]: I0221 22:04:00.946461 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dbfd-account-create-update-lhmrn" event={"ID":"d2afd36f-b3bb-4232-8b66-2cc64cf53f6a","Type":"ContainerStarted","Data":"cb31d7e38d7d4e6cb6788ccd5c2f456bd8999b611377481fb37f6cdabc761346"} Feb 21 22:04:01 crc kubenswrapper[4717]: W0221 22:04:01.055294 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f8bbbe9_6ba9_48a4_8ade_3e41ee668a2b.slice/crio-f424425cfc64047954d6e8cc1ed09f26357835f46c48ed9d6a7408cb30bd3a94 WatchSource:0}: Error finding container f424425cfc64047954d6e8cc1ed09f26357835f46c48ed9d6a7408cb30bd3a94: Status 404 returned error can't find the container with id f424425cfc64047954d6e8cc1ed09f26357835f46c48ed9d6a7408cb30bd3a94 Feb 21 22:04:01 crc kubenswrapper[4717]: I0221 22:04:01.071597 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vvzkx" event={"ID":"7d34ce70-aef8-44bc-872a-be96892f145f","Type":"ContainerStarted","Data":"1c20a06b0e7e5a9f33013e53a59c5d71c23b9e60b25e73ebee0f4d48c2ad3362"} Feb 21 22:04:01 crc kubenswrapper[4717]: I0221 22:04:01.071646 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vvzkx" event={"ID":"7d34ce70-aef8-44bc-872a-be96892f145f","Type":"ContainerStarted","Data":"2f4a60c0bd8ce9783c29e789fa52a62e4a13df62571e0dd426ac6500e36528bc"} Feb 21 22:04:01 crc kubenswrapper[4717]: I0221 22:04:01.096003 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 21 22:04:01 crc kubenswrapper[4717]: I0221 22:04:01.096245 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="52df3aba-f914-4946-877c-696b2a29635e" 
containerName="glance-log" containerID="cri-o://860db4a3cce90dd8b9140766be130786d23775f01cb01b8ebb1444ca9e310a10" gracePeriod=30 Feb 21 22:04:01 crc kubenswrapper[4717]: I0221 22:04:01.096396 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="52df3aba-f914-4946-877c-696b2a29635e" containerName="glance-httpd" containerID="cri-o://1e689ee64fcdd8877957d289ad22bbeee9f9bc12f3a4a3b3b4e6968b03d20826" gracePeriod=30 Feb 21 22:04:01 crc kubenswrapper[4717]: I0221 22:04:01.143083 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-vvzkx" podStartSLOduration=2.143061267 podStartE2EDuration="2.143061267s" podCreationTimestamp="2026-02-21 22:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:04:01.099435978 +0000 UTC m=+1055.880969590" watchObservedRunningTime="2026-02-21 22:04:01.143061267 +0000 UTC m=+1055.924594889" Feb 21 22:04:01 crc kubenswrapper[4717]: I0221 22:04:01.216408 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:04:01 crc kubenswrapper[4717]: I0221 22:04:01.262849 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7cc86c585-7kk6w"] Feb 21 22:04:01 crc kubenswrapper[4717]: I0221 22:04:01.265022 4717 scope.go:117] "RemoveContainer" containerID="c58eaf12666196ccde2e6a9e1513d9eb0946fea8f0bf4fe7203c1d2335ba8091" Feb 21 22:04:01 crc kubenswrapper[4717]: I0221 22:04:01.273807 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7cc86c585-7kk6w"] Feb 21 22:04:01 crc kubenswrapper[4717]: I0221 22:04:01.989446 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e08bfd4-7ef2-4895-89e2-c9265d0adc13" path="/var/lib/kubelet/pods/2e08bfd4-7ef2-4895-89e2-c9265d0adc13/volumes" Feb 21 22:04:01 crc kubenswrapper[4717]: I0221 
22:04:01.990200 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" path="/var/lib/kubelet/pods/acdc8b18-646d-4f3d-8c30-9e80d7b78058/volumes" Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.091535 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.101162 4717 generic.go:334] "Generic (PLEG): container finished" podID="8ecbe558-f60c-4be1-8dbb-35110ba64185" containerID="d572024a8c5af8297462282be44724c6982034ea6de6810b1a38b9cb6a3d2b96" exitCode=0 Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.101222 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wl5ms" event={"ID":"8ecbe558-f60c-4be1-8dbb-35110ba64185","Type":"ContainerDied","Data":"d572024a8c5af8297462282be44724c6982034ea6de6810b1a38b9cb6a3d2b96"} Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.107405 4717 generic.go:334] "Generic (PLEG): container finished" podID="9186937b-6e90-45ec-9494-aef66cdfe28b" containerID="4e6667a53120acec0f2b5bc99ee5bea5984df76c459961e4c47d0fd7a9383d4c" exitCode=0 Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.107519 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1f48-account-create-update-wjdg5" event={"ID":"9186937b-6e90-45ec-9494-aef66cdfe28b","Type":"ContainerDied","Data":"4e6667a53120acec0f2b5bc99ee5bea5984df76c459961e4c47d0fd7a9383d4c"} Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.107565 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1f48-account-create-update-wjdg5" event={"ID":"9186937b-6e90-45ec-9494-aef66cdfe28b","Type":"ContainerStarted","Data":"553028453a53eaec63db6a357fca013088dc5e95313262c1d8939ee9016a4ece"} Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.109989 4717 generic.go:334] "Generic (PLEG): container finished" podID="7d34ce70-aef8-44bc-872a-be96892f145f" 
containerID="1c20a06b0e7e5a9f33013e53a59c5d71c23b9e60b25e73ebee0f4d48c2ad3362" exitCode=0 Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.110088 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vvzkx" event={"ID":"7d34ce70-aef8-44bc-872a-be96892f145f","Type":"ContainerDied","Data":"1c20a06b0e7e5a9f33013e53a59c5d71c23b9e60b25e73ebee0f4d48c2ad3362"} Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.114647 4717 generic.go:334] "Generic (PLEG): container finished" podID="9d430f5c-b58e-4b87-9ff4-391c6d796215" containerID="230f2a9545a84d795c8ddd65b06ca9b914295efb8d182ab283e3028179e3398d" exitCode=0 Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.114731 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b797-account-create-update-mfrb6" event={"ID":"9d430f5c-b58e-4b87-9ff4-391c6d796215","Type":"ContainerDied","Data":"230f2a9545a84d795c8ddd65b06ca9b914295efb8d182ab283e3028179e3398d"} Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.114757 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b797-account-create-update-mfrb6" event={"ID":"9d430f5c-b58e-4b87-9ff4-391c6d796215","Type":"ContainerStarted","Data":"08ecb4312456d16d95f30450545bdf4f66b77e23763661adb4fd6b50ab3ede2a"} Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.116952 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b","Type":"ContainerStarted","Data":"f424425cfc64047954d6e8cc1ed09f26357835f46c48ed9d6a7408cb30bd3a94"} Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.126100 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87baa801-afa3-4f80-abf4-cbffcd2da28e","Type":"ContainerStarted","Data":"f44bab6d6ad295b226616c3fb4bd8cca32f45114eab8b1228dec16c2e3393f90"} Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.128753 4717 
generic.go:334] "Generic (PLEG): container finished" podID="d2afd36f-b3bb-4232-8b66-2cc64cf53f6a" containerID="4340e361658741eaff1dbdeeba057fca6c5da84ba60cbd0228467f7e3eab75f4" exitCode=0 Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.128849 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dbfd-account-create-update-lhmrn" event={"ID":"d2afd36f-b3bb-4232-8b66-2cc64cf53f6a","Type":"ContainerDied","Data":"4340e361658741eaff1dbdeeba057fca6c5da84ba60cbd0228467f7e3eab75f4"} Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.134849 4717 generic.go:334] "Generic (PLEG): container finished" podID="4dfd7221-c036-41a9-be83-4b0d9d1fac4b" containerID="fcd44a1187b50a99d3c545854bd8b3f477554d831ffa40d1532d971cb1dc03c8" exitCode=0 Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.134965 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gvjrw" event={"ID":"4dfd7221-c036-41a9-be83-4b0d9d1fac4b","Type":"ContainerDied","Data":"fcd44a1187b50a99d3c545854bd8b3f477554d831ffa40d1532d971cb1dc03c8"} Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.145285 4717 generic.go:334] "Generic (PLEG): container finished" podID="52df3aba-f914-4946-877c-696b2a29635e" containerID="860db4a3cce90dd8b9140766be130786d23775f01cb01b8ebb1444ca9e310a10" exitCode=143 Feb 21 22:04:02 crc kubenswrapper[4717]: I0221 22:04:02.145543 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52df3aba-f914-4946-877c-696b2a29635e","Type":"ContainerDied","Data":"860db4a3cce90dd8b9140766be130786d23775f01cb01b8ebb1444ca9e310a10"} Feb 21 22:04:03 crc kubenswrapper[4717]: I0221 22:04:03.167263 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b","Type":"ContainerStarted","Data":"80b0a1296f92a0c4fa64f4714c6aa545d01336d0df5ad2ebe2f3c83f8170d528"} Feb 21 22:04:03 crc kubenswrapper[4717]: 
I0221 22:04:03.169266 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 21 22:04:03 crc kubenswrapper[4717]: I0221 22:04:03.171675 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87baa801-afa3-4f80-abf4-cbffcd2da28e","Type":"ContainerStarted","Data":"a1af4826f0e7246c15309ff2d931327009ca572006624f4d6dee31ad3acecb90"} Feb 21 22:04:03 crc kubenswrapper[4717]: I0221 22:04:03.171723 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87baa801-afa3-4f80-abf4-cbffcd2da28e","Type":"ContainerStarted","Data":"aa76b112e5b48beaa8118dc9b12854105037c03337ea9da7c33c9425200cdc82"} Feb 21 22:04:03 crc kubenswrapper[4717]: I0221 22:04:03.186707 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.627396278 podStartE2EDuration="4.186682485s" podCreationTimestamp="2026-02-21 22:03:59 +0000 UTC" firstStartedPulling="2026-02-21 22:04:01.090030981 +0000 UTC m=+1055.871564603" lastFinishedPulling="2026-02-21 22:04:01.649317188 +0000 UTC m=+1056.430850810" observedRunningTime="2026-02-21 22:04:03.184767698 +0000 UTC m=+1057.966301360" watchObservedRunningTime="2026-02-21 22:04:03.186682485 +0000 UTC m=+1057.968216137" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.182982 4717 generic.go:334] "Generic (PLEG): container finished" podID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerID="3e81fcad833c7e7a7f433096a45c795ab83d621d5970448361c95c6961841ed0" exitCode=137 Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.183077 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-669df94976-tmfpb" event={"ID":"7af1cf64-7044-4170-9ba4-bcc17d97cbb2","Type":"ContainerDied","Data":"3e81fcad833c7e7a7f433096a45c795ab83d621d5970448361c95c6961841ed0"} Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.767209 4717 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-dbfd-account-create-update-lhmrn" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.772733 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b797-account-create-update-mfrb6" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.777915 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wl5ms" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.783565 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vvzkx" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.791691 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-gvjrw" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.801683 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1f48-account-create-update-wjdg5" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.929628 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ecbe558-f60c-4be1-8dbb-35110ba64185-operator-scripts\") pod \"8ecbe558-f60c-4be1-8dbb-35110ba64185\" (UID: \"8ecbe558-f60c-4be1-8dbb-35110ba64185\") " Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.929680 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj89x\" (UniqueName: \"kubernetes.io/projected/d2afd36f-b3bb-4232-8b66-2cc64cf53f6a-kube-api-access-lj89x\") pod \"d2afd36f-b3bb-4232-8b66-2cc64cf53f6a\" (UID: \"d2afd36f-b3bb-4232-8b66-2cc64cf53f6a\") " Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.929743 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c75nd\" (UniqueName: 
\"kubernetes.io/projected/8ecbe558-f60c-4be1-8dbb-35110ba64185-kube-api-access-c75nd\") pod \"8ecbe558-f60c-4be1-8dbb-35110ba64185\" (UID: \"8ecbe558-f60c-4be1-8dbb-35110ba64185\") " Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.929773 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d430f5c-b58e-4b87-9ff4-391c6d796215-operator-scripts\") pod \"9d430f5c-b58e-4b87-9ff4-391c6d796215\" (UID: \"9d430f5c-b58e-4b87-9ff4-391c6d796215\") " Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.929799 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdftn\" (UniqueName: \"kubernetes.io/projected/9d430f5c-b58e-4b87-9ff4-391c6d796215-kube-api-access-vdftn\") pod \"9d430f5c-b58e-4b87-9ff4-391c6d796215\" (UID: \"9d430f5c-b58e-4b87-9ff4-391c6d796215\") " Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.929824 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjd97\" (UniqueName: \"kubernetes.io/projected/7d34ce70-aef8-44bc-872a-be96892f145f-kube-api-access-jjd97\") pod \"7d34ce70-aef8-44bc-872a-be96892f145f\" (UID: \"7d34ce70-aef8-44bc-872a-be96892f145f\") " Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.929902 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9186937b-6e90-45ec-9494-aef66cdfe28b-operator-scripts\") pod \"9186937b-6e90-45ec-9494-aef66cdfe28b\" (UID: \"9186937b-6e90-45ec-9494-aef66cdfe28b\") " Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.929931 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86vkt\" (UniqueName: \"kubernetes.io/projected/4dfd7221-c036-41a9-be83-4b0d9d1fac4b-kube-api-access-86vkt\") pod \"4dfd7221-c036-41a9-be83-4b0d9d1fac4b\" (UID: 
\"4dfd7221-c036-41a9-be83-4b0d9d1fac4b\") " Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.929952 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d34ce70-aef8-44bc-872a-be96892f145f-operator-scripts\") pod \"7d34ce70-aef8-44bc-872a-be96892f145f\" (UID: \"7d34ce70-aef8-44bc-872a-be96892f145f\") " Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.930042 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prsg8\" (UniqueName: \"kubernetes.io/projected/9186937b-6e90-45ec-9494-aef66cdfe28b-kube-api-access-prsg8\") pod \"9186937b-6e90-45ec-9494-aef66cdfe28b\" (UID: \"9186937b-6e90-45ec-9494-aef66cdfe28b\") " Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.930065 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dfd7221-c036-41a9-be83-4b0d9d1fac4b-operator-scripts\") pod \"4dfd7221-c036-41a9-be83-4b0d9d1fac4b\" (UID: \"4dfd7221-c036-41a9-be83-4b0d9d1fac4b\") " Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.930087 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2afd36f-b3bb-4232-8b66-2cc64cf53f6a-operator-scripts\") pod \"d2afd36f-b3bb-4232-8b66-2cc64cf53f6a\" (UID: \"d2afd36f-b3bb-4232-8b66-2cc64cf53f6a\") " Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.932650 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dfd7221-c036-41a9-be83-4b0d9d1fac4b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4dfd7221-c036-41a9-be83-4b0d9d1fac4b" (UID: "4dfd7221-c036-41a9-be83-4b0d9d1fac4b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.932685 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d430f5c-b58e-4b87-9ff4-391c6d796215-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d430f5c-b58e-4b87-9ff4-391c6d796215" (UID: "9d430f5c-b58e-4b87-9ff4-391c6d796215"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.932710 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ecbe558-f60c-4be1-8dbb-35110ba64185-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ecbe558-f60c-4be1-8dbb-35110ba64185" (UID: "8ecbe558-f60c-4be1-8dbb-35110ba64185"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.932935 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d34ce70-aef8-44bc-872a-be96892f145f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d34ce70-aef8-44bc-872a-be96892f145f" (UID: "7d34ce70-aef8-44bc-872a-be96892f145f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.933122 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9186937b-6e90-45ec-9494-aef66cdfe28b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9186937b-6e90-45ec-9494-aef66cdfe28b" (UID: "9186937b-6e90-45ec-9494-aef66cdfe28b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.934122 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2afd36f-b3bb-4232-8b66-2cc64cf53f6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d2afd36f-b3bb-4232-8b66-2cc64cf53f6a" (UID: "d2afd36f-b3bb-4232-8b66-2cc64cf53f6a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.942375 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9186937b-6e90-45ec-9494-aef66cdfe28b-kube-api-access-prsg8" (OuterVolumeSpecName: "kube-api-access-prsg8") pod "9186937b-6e90-45ec-9494-aef66cdfe28b" (UID: "9186937b-6e90-45ec-9494-aef66cdfe28b"). InnerVolumeSpecName "kube-api-access-prsg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.959111 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d430f5c-b58e-4b87-9ff4-391c6d796215-kube-api-access-vdftn" (OuterVolumeSpecName: "kube-api-access-vdftn") pod "9d430f5c-b58e-4b87-9ff4-391c6d796215" (UID: "9d430f5c-b58e-4b87-9ff4-391c6d796215"). InnerVolumeSpecName "kube-api-access-vdftn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.959151 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d34ce70-aef8-44bc-872a-be96892f145f-kube-api-access-jjd97" (OuterVolumeSpecName: "kube-api-access-jjd97") pod "7d34ce70-aef8-44bc-872a-be96892f145f" (UID: "7d34ce70-aef8-44bc-872a-be96892f145f"). InnerVolumeSpecName "kube-api-access-jjd97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.960312 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2afd36f-b3bb-4232-8b66-2cc64cf53f6a-kube-api-access-lj89x" (OuterVolumeSpecName: "kube-api-access-lj89x") pod "d2afd36f-b3bb-4232-8b66-2cc64cf53f6a" (UID: "d2afd36f-b3bb-4232-8b66-2cc64cf53f6a"). InnerVolumeSpecName "kube-api-access-lj89x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.960471 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ecbe558-f60c-4be1-8dbb-35110ba64185-kube-api-access-c75nd" (OuterVolumeSpecName: "kube-api-access-c75nd") pod "8ecbe558-f60c-4be1-8dbb-35110ba64185" (UID: "8ecbe558-f60c-4be1-8dbb-35110ba64185"). InnerVolumeSpecName "kube-api-access-c75nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:04 crc kubenswrapper[4717]: I0221 22:04:04.963626 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dfd7221-c036-41a9-be83-4b0d9d1fac4b-kube-api-access-86vkt" (OuterVolumeSpecName: "kube-api-access-86vkt") pod "4dfd7221-c036-41a9-be83-4b0d9d1fac4b" (UID: "4dfd7221-c036-41a9-be83-4b0d9d1fac4b"). InnerVolumeSpecName "kube-api-access-86vkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.035639 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9186937b-6e90-45ec-9494-aef66cdfe28b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.035675 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86vkt\" (UniqueName: \"kubernetes.io/projected/4dfd7221-c036-41a9-be83-4b0d9d1fac4b-kube-api-access-86vkt\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.035689 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d34ce70-aef8-44bc-872a-be96892f145f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.035700 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prsg8\" (UniqueName: \"kubernetes.io/projected/9186937b-6e90-45ec-9494-aef66cdfe28b-kube-api-access-prsg8\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.035713 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dfd7221-c036-41a9-be83-4b0d9d1fac4b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.035722 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2afd36f-b3bb-4232-8b66-2cc64cf53f6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.035732 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ecbe558-f60c-4be1-8dbb-35110ba64185-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc 
kubenswrapper[4717]: I0221 22:04:05.035744 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj89x\" (UniqueName: \"kubernetes.io/projected/d2afd36f-b3bb-4232-8b66-2cc64cf53f6a-kube-api-access-lj89x\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.035762 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c75nd\" (UniqueName: \"kubernetes.io/projected/8ecbe558-f60c-4be1-8dbb-35110ba64185-kube-api-access-c75nd\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.035773 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d430f5c-b58e-4b87-9ff4-391c6d796215-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.035787 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdftn\" (UniqueName: \"kubernetes.io/projected/9d430f5c-b58e-4b87-9ff4-391c6d796215-kube-api-access-vdftn\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.035821 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjd97\" (UniqueName: \"kubernetes.io/projected/7d34ce70-aef8-44bc-872a-be96892f145f-kube-api-access-jjd97\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.210473 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vvzkx" event={"ID":"7d34ce70-aef8-44bc-872a-be96892f145f","Type":"ContainerDied","Data":"2f4a60c0bd8ce9783c29e789fa52a62e4a13df62571e0dd426ac6500e36528bc"} Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.210517 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f4a60c0bd8ce9783c29e789fa52a62e4a13df62571e0dd426ac6500e36528bc" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.210597 4717 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vvzkx" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.216711 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b797-account-create-update-mfrb6" event={"ID":"9d430f5c-b58e-4b87-9ff4-391c6d796215","Type":"ContainerDied","Data":"08ecb4312456d16d95f30450545bdf4f66b77e23763661adb4fd6b50ab3ede2a"} Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.216752 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08ecb4312456d16d95f30450545bdf4f66b77e23763661adb4fd6b50ab3ede2a" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.216817 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b797-account-create-update-mfrb6" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.228002 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wl5ms" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.228019 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wl5ms" event={"ID":"8ecbe558-f60c-4be1-8dbb-35110ba64185","Type":"ContainerDied","Data":"457811339de51543832667779551767f7a7b275f1c3602ab34d5b47348214166"} Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.228076 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="457811339de51543832667779551767f7a7b275f1c3602ab34d5b47348214166" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.237959 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.238771 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1f48-account-create-update-wjdg5" event={"ID":"9186937b-6e90-45ec-9494-aef66cdfe28b","Type":"ContainerDied","Data":"553028453a53eaec63db6a357fca013088dc5e95313262c1d8939ee9016a4ece"} Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.238809 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="553028453a53eaec63db6a357fca013088dc5e95313262c1d8939ee9016a4ece" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.238905 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1f48-account-create-update-wjdg5" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.250658 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gvjrw" event={"ID":"4dfd7221-c036-41a9-be83-4b0d9d1fac4b","Type":"ContainerDied","Data":"c352a93ad42340a963769f55a8745a2c74c8ec315362690e0adb8b7066e0b68c"} Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.251057 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c352a93ad42340a963769f55a8745a2c74c8ec315362690e0adb8b7066e0b68c" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.250950 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-gvjrw" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.257037 4717 generic.go:334] "Generic (PLEG): container finished" podID="7dccedac-c29e-4ae2-bfac-d55b444cb715" containerID="cd9cdcb54d005627316cfa56c267d1c64c79db62fb498db101e2f032a6ec6998" exitCode=0 Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.257774 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8fbcb64b8-88w42" event={"ID":"7dccedac-c29e-4ae2-bfac-d55b444cb715","Type":"ContainerDied","Data":"cd9cdcb54d005627316cfa56c267d1c64c79db62fb498db101e2f032a6ec6998"} Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.261509 4717 generic.go:334] "Generic (PLEG): container finished" podID="52df3aba-f914-4946-877c-696b2a29635e" containerID="1e689ee64fcdd8877957d289ad22bbeee9f9bc12f3a4a3b3b4e6968b03d20826" exitCode=0 Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.261636 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52df3aba-f914-4946-877c-696b2a29635e","Type":"ContainerDied","Data":"1e689ee64fcdd8877957d289ad22bbeee9f9bc12f3a4a3b3b4e6968b03d20826"} Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.276357 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dbfd-account-create-update-lhmrn" event={"ID":"d2afd36f-b3bb-4232-8b66-2cc64cf53f6a","Type":"ContainerDied","Data":"cb31d7e38d7d4e6cb6788ccd5c2f456bd8999b611377481fb37f6cdabc761346"} Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.276414 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb31d7e38d7d4e6cb6788ccd5c2f456bd8999b611377481fb37f6cdabc761346" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.277997 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-dbfd-account-create-update-lhmrn" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.341624 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-horizon-secret-key\") pod \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.341662 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-scripts\") pod \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.341793 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-horizon-tls-certs\") pod \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.342272 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-combined-ca-bundle\") pod \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.342357 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs9z5\" (UniqueName: \"kubernetes.io/projected/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-kube-api-access-rs9z5\") pod \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.342389 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-config-data\") pod \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.342417 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-logs\") pod \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\" (UID: \"7af1cf64-7044-4170-9ba4-bcc17d97cbb2\") " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.343051 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-logs" (OuterVolumeSpecName: "logs") pod "7af1cf64-7044-4170-9ba4-bcc17d97cbb2" (UID: "7af1cf64-7044-4170-9ba4-bcc17d97cbb2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.350021 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7af1cf64-7044-4170-9ba4-bcc17d97cbb2" (UID: "7af1cf64-7044-4170-9ba4-bcc17d97cbb2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.362605 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-kube-api-access-rs9z5" (OuterVolumeSpecName: "kube-api-access-rs9z5") pod "7af1cf64-7044-4170-9ba4-bcc17d97cbb2" (UID: "7af1cf64-7044-4170-9ba4-bcc17d97cbb2"). InnerVolumeSpecName "kube-api-access-rs9z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.367389 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-scripts" (OuterVolumeSpecName: "scripts") pod "7af1cf64-7044-4170-9ba4-bcc17d97cbb2" (UID: "7af1cf64-7044-4170-9ba4-bcc17d97cbb2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.377070 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-config-data" (OuterVolumeSpecName: "config-data") pod "7af1cf64-7044-4170-9ba4-bcc17d97cbb2" (UID: "7af1cf64-7044-4170-9ba4-bcc17d97cbb2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.384761 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7af1cf64-7044-4170-9ba4-bcc17d97cbb2" (UID: "7af1cf64-7044-4170-9ba4-bcc17d97cbb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.420154 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "7af1cf64-7044-4170-9ba4-bcc17d97cbb2" (UID: "7af1cf64-7044-4170-9ba4-bcc17d97cbb2"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.444891 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.444920 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.444930 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs9z5\" (UniqueName: \"kubernetes.io/projected/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-kube-api-access-rs9z5\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.444940 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.444949 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-logs\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.444957 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.444966 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7af1cf64-7044-4170-9ba4-bcc17d97cbb2-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.495269 4717 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.648575 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-public-tls-certs\") pod \"52df3aba-f914-4946-877c-696b2a29635e\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.648616 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52df3aba-f914-4946-877c-696b2a29635e-logs\") pod \"52df3aba-f914-4946-877c-696b2a29635e\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.648696 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-scripts\") pod \"52df3aba-f914-4946-877c-696b2a29635e\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.648735 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q667j\" (UniqueName: \"kubernetes.io/projected/52df3aba-f914-4946-877c-696b2a29635e-kube-api-access-q667j\") pod \"52df3aba-f914-4946-877c-696b2a29635e\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.648751 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-config-data\") pod \"52df3aba-f914-4946-877c-696b2a29635e\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.648784 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"52df3aba-f914-4946-877c-696b2a29635e\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.648818 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52df3aba-f914-4946-877c-696b2a29635e-httpd-run\") pod \"52df3aba-f914-4946-877c-696b2a29635e\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.648909 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-combined-ca-bundle\") pod \"52df3aba-f914-4946-877c-696b2a29635e\" (UID: \"52df3aba-f914-4946-877c-696b2a29635e\") " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.649695 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52df3aba-f914-4946-877c-696b2a29635e-logs" (OuterVolumeSpecName: "logs") pod "52df3aba-f914-4946-877c-696b2a29635e" (UID: "52df3aba-f914-4946-877c-696b2a29635e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.661289 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "52df3aba-f914-4946-877c-696b2a29635e" (UID: "52df3aba-f914-4946-877c-696b2a29635e"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.661659 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52df3aba-f914-4946-877c-696b2a29635e-kube-api-access-q667j" (OuterVolumeSpecName: "kube-api-access-q667j") pod "52df3aba-f914-4946-877c-696b2a29635e" (UID: "52df3aba-f914-4946-877c-696b2a29635e"). InnerVolumeSpecName "kube-api-access-q667j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.663015 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52df3aba-f914-4946-877c-696b2a29635e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "52df3aba-f914-4946-877c-696b2a29635e" (UID: "52df3aba-f914-4946-877c-696b2a29635e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.669075 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-scripts" (OuterVolumeSpecName: "scripts") pod "52df3aba-f914-4946-877c-696b2a29635e" (UID: "52df3aba-f914-4946-877c-696b2a29635e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.696454 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52df3aba-f914-4946-877c-696b2a29635e" (UID: "52df3aba-f914-4946-877c-696b2a29635e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.753018 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52df3aba-f914-4946-877c-696b2a29635e-logs\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.753058 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.753069 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q667j\" (UniqueName: \"kubernetes.io/projected/52df3aba-f914-4946-877c-696b2a29635e-kube-api-access-q667j\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.753091 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.753102 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52df3aba-f914-4946-877c-696b2a29635e-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.753111 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.806157 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.808063 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "52df3aba-f914-4946-877c-696b2a29635e" (UID: "52df3aba-f914-4946-877c-696b2a29635e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.822691 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-config-data" (OuterVolumeSpecName: "config-data") pod "52df3aba-f914-4946-877c-696b2a29635e" (UID: "52df3aba-f914-4946-877c-696b2a29635e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.854270 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.854572 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52df3aba-f914-4946-877c-696b2a29635e-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:05 crc kubenswrapper[4717]: I0221 22:04:05.854581 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.026548 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.158685 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-config\") pod \"7dccedac-c29e-4ae2-bfac-d55b444cb715\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.158776 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-httpd-config\") pod \"7dccedac-c29e-4ae2-bfac-d55b444cb715\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.158834 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-combined-ca-bundle\") pod \"7dccedac-c29e-4ae2-bfac-d55b444cb715\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.158970 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-ovndb-tls-certs\") pod \"7dccedac-c29e-4ae2-bfac-d55b444cb715\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.158998 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grnf4\" (UniqueName: \"kubernetes.io/projected/7dccedac-c29e-4ae2-bfac-d55b444cb715-kube-api-access-grnf4\") pod \"7dccedac-c29e-4ae2-bfac-d55b444cb715\" (UID: \"7dccedac-c29e-4ae2-bfac-d55b444cb715\") " Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.163136 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7dccedac-c29e-4ae2-bfac-d55b444cb715-kube-api-access-grnf4" (OuterVolumeSpecName: "kube-api-access-grnf4") pod "7dccedac-c29e-4ae2-bfac-d55b444cb715" (UID: "7dccedac-c29e-4ae2-bfac-d55b444cb715"). InnerVolumeSpecName "kube-api-access-grnf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.163429 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7dccedac-c29e-4ae2-bfac-d55b444cb715" (UID: "7dccedac-c29e-4ae2-bfac-d55b444cb715"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.217937 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-config" (OuterVolumeSpecName: "config") pod "7dccedac-c29e-4ae2-bfac-d55b444cb715" (UID: "7dccedac-c29e-4ae2-bfac-d55b444cb715"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.223771 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dccedac-c29e-4ae2-bfac-d55b444cb715" (UID: "7dccedac-c29e-4ae2-bfac-d55b444cb715"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.247945 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7dccedac-c29e-4ae2-bfac-d55b444cb715" (UID: "7dccedac-c29e-4ae2-bfac-d55b444cb715"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.260932 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.261154 4717 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.261166 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grnf4\" (UniqueName: \"kubernetes.io/projected/7dccedac-c29e-4ae2-bfac-d55b444cb715-kube-api-access-grnf4\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.261179 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.261190 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7dccedac-c29e-4ae2-bfac-d55b444cb715-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.289317 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87baa801-afa3-4f80-abf4-cbffcd2da28e","Type":"ContainerStarted","Data":"3ce74597aea0f5586a5b5cdd67a254581cb2e872030d44c6376212846bae64bd"} Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.291257 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8fbcb64b8-88w42" event={"ID":"7dccedac-c29e-4ae2-bfac-d55b444cb715","Type":"ContainerDied","Data":"4c0e2534efe08bea1bd840cb0bc7a32c8d7b2606165a5b5e6dc3be7b59fe0a6f"} Feb 21 
22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.291300 4717 scope.go:117] "RemoveContainer" containerID="42c29a25ab668a8d51496582563028daa1de047fe727efa08fd3504144a43836" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.291453 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8fbcb64b8-88w42" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.297586 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52df3aba-f914-4946-877c-696b2a29635e","Type":"ContainerDied","Data":"592ebc94154f46d3064064dae5eb1a8f5e6aba38c2cd9cf663ee1a660face980"} Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.297622 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.305888 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-669df94976-tmfpb" event={"ID":"7af1cf64-7044-4170-9ba4-bcc17d97cbb2","Type":"ContainerDied","Data":"6b2c175bec57cb55b86d9f9bc56882730c4439ae1be8aeb645dfee9739677a7a"} Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.306080 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-669df94976-tmfpb" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.322321 4717 scope.go:117] "RemoveContainer" containerID="cd9cdcb54d005627316cfa56c267d1c64c79db62fb498db101e2f032a6ec6998" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.331007 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.340265 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.351237 4717 scope.go:117] "RemoveContainer" containerID="1e689ee64fcdd8877957d289ad22bbeee9f9bc12f3a4a3b3b4e6968b03d20826" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.361889 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 21 22:04:06 crc kubenswrapper[4717]: E0221 22:04:06.362343 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52df3aba-f914-4946-877c-696b2a29635e" containerName="glance-httpd" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362365 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="52df3aba-f914-4946-877c-696b2a29635e" containerName="glance-httpd" Feb 21 22:04:06 crc kubenswrapper[4717]: E0221 22:04:06.362381 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerName="horizon-log" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362389 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerName="horizon-log" Feb 21 22:04:06 crc kubenswrapper[4717]: E0221 22:04:06.362402 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d430f5c-b58e-4b87-9ff4-391c6d796215" containerName="mariadb-account-create-update" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362410 4717 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9d430f5c-b58e-4b87-9ff4-391c6d796215" containerName="mariadb-account-create-update" Feb 21 22:04:06 crc kubenswrapper[4717]: E0221 22:04:06.362425 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2afd36f-b3bb-4232-8b66-2cc64cf53f6a" containerName="mariadb-account-create-update" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362431 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2afd36f-b3bb-4232-8b66-2cc64cf53f6a" containerName="mariadb-account-create-update" Feb 21 22:04:06 crc kubenswrapper[4717]: E0221 22:04:06.362444 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dfd7221-c036-41a9-be83-4b0d9d1fac4b" containerName="mariadb-database-create" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362450 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dfd7221-c036-41a9-be83-4b0d9d1fac4b" containerName="mariadb-database-create" Feb 21 22:04:06 crc kubenswrapper[4717]: E0221 22:04:06.362459 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ecbe558-f60c-4be1-8dbb-35110ba64185" containerName="mariadb-database-create" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362465 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ecbe558-f60c-4be1-8dbb-35110ba64185" containerName="mariadb-database-create" Feb 21 22:04:06 crc kubenswrapper[4717]: E0221 22:04:06.362476 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerName="horizon" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362482 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerName="horizon" Feb 21 22:04:06 crc kubenswrapper[4717]: E0221 22:04:06.362491 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerName="neutron-api" Feb 21 22:04:06 crc 
kubenswrapper[4717]: I0221 22:04:06.362498 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerName="neutron-api" Feb 21 22:04:06 crc kubenswrapper[4717]: E0221 22:04:06.362512 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerName="neutron-httpd" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362518 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerName="neutron-httpd" Feb 21 22:04:06 crc kubenswrapper[4717]: E0221 22:04:06.362537 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52df3aba-f914-4946-877c-696b2a29635e" containerName="glance-log" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362543 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="52df3aba-f914-4946-877c-696b2a29635e" containerName="glance-log" Feb 21 22:04:06 crc kubenswrapper[4717]: E0221 22:04:06.362561 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dccedac-c29e-4ae2-bfac-d55b444cb715" containerName="neutron-httpd" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362568 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dccedac-c29e-4ae2-bfac-d55b444cb715" containerName="neutron-httpd" Feb 21 22:04:06 crc kubenswrapper[4717]: E0221 22:04:06.362578 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dccedac-c29e-4ae2-bfac-d55b444cb715" containerName="neutron-api" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362584 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dccedac-c29e-4ae2-bfac-d55b444cb715" containerName="neutron-api" Feb 21 22:04:06 crc kubenswrapper[4717]: E0221 22:04:06.362596 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9186937b-6e90-45ec-9494-aef66cdfe28b" containerName="mariadb-account-create-update" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 
22:04:06.362602 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9186937b-6e90-45ec-9494-aef66cdfe28b" containerName="mariadb-account-create-update" Feb 21 22:04:06 crc kubenswrapper[4717]: E0221 22:04:06.362610 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d34ce70-aef8-44bc-872a-be96892f145f" containerName="mariadb-database-create" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362617 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d34ce70-aef8-44bc-872a-be96892f145f" containerName="mariadb-database-create" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362822 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerName="neutron-api" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362841 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dccedac-c29e-4ae2-bfac-d55b444cb715" containerName="neutron-httpd" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362855 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerName="horizon" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362923 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerName="neutron-api" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362935 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dccedac-c29e-4ae2-bfac-d55b444cb715" containerName="neutron-api" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362954 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d430f5c-b58e-4b87-9ff4-391c6d796215" containerName="mariadb-account-create-update" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362971 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="52df3aba-f914-4946-877c-696b2a29635e" containerName="glance-log" Feb 21 22:04:06 crc 
kubenswrapper[4717]: I0221 22:04:06.362984 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d34ce70-aef8-44bc-872a-be96892f145f" containerName="mariadb-database-create" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.362995 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" containerName="horizon-log" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.363005 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerName="neutron-httpd" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.363014 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9186937b-6e90-45ec-9494-aef66cdfe28b" containerName="mariadb-account-create-update" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.363025 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="52df3aba-f914-4946-877c-696b2a29635e" containerName="glance-httpd" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.363033 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ecbe558-f60c-4be1-8dbb-35110ba64185" containerName="mariadb-database-create" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.363046 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dfd7221-c036-41a9-be83-4b0d9d1fac4b" containerName="mariadb-database-create" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.363056 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2afd36f-b3bb-4232-8b66-2cc64cf53f6a" containerName="mariadb-account-create-update" Feb 21 22:04:06 crc kubenswrapper[4717]: E0221 22:04:06.363266 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerName="neutron-api" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.363277 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="acdc8b18-646d-4f3d-8c30-9e80d7b78058" containerName="neutron-api" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.364264 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.369962 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.370268 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.392560 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-669df94976-tmfpb"] Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.396137 4717 scope.go:117] "RemoveContainer" containerID="860db4a3cce90dd8b9140766be130786d23775f01cb01b8ebb1444ca9e310a10" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.422235 4717 scope.go:117] "RemoveContainer" containerID="81ea984b703415ccda6d6772c5e00a9f9a88a1f920ae9fd881bd21238a2548b2" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.426025 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-669df94976-tmfpb"] Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.453314 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.465927 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8fbcb64b8-88w42"] Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.469717 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5knk7\" (UniqueName: \"kubernetes.io/projected/a8060c80-2f4b-4099-b5fd-841fadcdb329-kube-api-access-5knk7\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " 
pod="openstack/glance-default-external-api-0" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.469809 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8060c80-2f4b-4099-b5fd-841fadcdb329-scripts\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.469881 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.469920 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8060c80-2f4b-4099-b5fd-841fadcdb329-logs\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.469961 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8060c80-2f4b-4099-b5fd-841fadcdb329-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0" Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.469989 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8060c80-2f4b-4099-b5fd-841fadcdb329-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " 
pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.470215 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8060c80-2f4b-4099-b5fd-841fadcdb329-config-data\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.470257 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8060c80-2f4b-4099-b5fd-841fadcdb329-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.473845 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8fbcb64b8-88w42"]
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.516604 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.517197 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c4dc8df6c-b88lw"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.571638 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8060c80-2f4b-4099-b5fd-841fadcdb329-scripts\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.571964 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod
\"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.572073 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8060c80-2f4b-4099-b5fd-841fadcdb329-logs\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.572189 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8060c80-2f4b-4099-b5fd-841fadcdb329-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.572277 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8060c80-2f4b-4099-b5fd-841fadcdb329-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.572387 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8060c80-2f4b-4099-b5fd-841fadcdb329-config-data\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.572471 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8060c80-2f4b-4099-b5fd-841fadcdb329-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID:
\"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.572603 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.573455 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5knk7\" (UniqueName: \"kubernetes.io/projected/a8060c80-2f4b-4099-b5fd-841fadcdb329-kube-api-access-5knk7\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.574963 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8060c80-2f4b-4099-b5fd-841fadcdb329-logs\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.575453 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8060c80-2f4b-4099-b5fd-841fadcdb329-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.576690 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8060c80-2f4b-4099-b5fd-841fadcdb329-scripts\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") "
pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.581787 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8060c80-2f4b-4099-b5fd-841fadcdb329-config-data\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.604285 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8060c80-2f4b-4099-b5fd-841fadcdb329-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.624568 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8060c80-2f4b-4099-b5fd-841fadcdb329-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.630261 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.630930 4717 scope.go:117] "RemoveContainer" containerID="3e81fcad833c7e7a7f433096a45c795ab83d621d5970448361c95c6961841ed0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.631094 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d64df97f-f950-4a75-b5a1-7497f752a5cb" containerName="glance-httpd" containerID="cri-o://80342756a29c71937eb0da4308d5d408a8c98b432d6e960fdf7ca357bccf2a1e" gracePeriod=30
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.631329 4717 kuberuntime_container.go:808]
"Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d64df97f-f950-4a75-b5a1-7497f752a5cb" containerName="glance-log" containerID="cri-o://6bbcbd8899019791389b55bb5fd8b7d6e32917b828da2b7fef7c7f8b7c22a22c" gracePeriod=30
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.639972 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5knk7\" (UniqueName: \"kubernetes.io/projected/a8060c80-2f4b-4099-b5fd-841fadcdb329-kube-api-access-5knk7\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:06 crc kubenswrapper[4717]: I0221 22:04:06.700197 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a8060c80-2f4b-4099-b5fd-841fadcdb329\") " pod="openstack/glance-default-external-api-0"
Feb 21 22:04:07 crc kubenswrapper[4717]: I0221 22:04:07.001611 4717 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 21 22:04:07 crc kubenswrapper[4717]: I0221 22:04:07.317033 4717 generic.go:334] "Generic (PLEG): container finished" podID="d64df97f-f950-4a75-b5a1-7497f752a5cb" containerID="6bbcbd8899019791389b55bb5fd8b7d6e32917b828da2b7fef7c7f8b7c22a22c" exitCode=143
Feb 21 22:04:07 crc kubenswrapper[4717]: I0221 22:04:07.317066 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d64df97f-f950-4a75-b5a1-7497f752a5cb","Type":"ContainerDied","Data":"6bbcbd8899019791389b55bb5fd8b7d6e32917b828da2b7fef7c7f8b7c22a22c"}
Feb 21 22:04:07 crc kubenswrapper[4717]: I0221 22:04:07.322128 4717 generic.go:334] "Generic (PLEG): container finished" podID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerID="928358688b2c823081af23ba0d21359c61bd2748bd5accc7c373aa47cec18612" exitCode=1
Feb 21 22:04:07 crc kubenswrapper[4717]: I0221 22:04:07.322199 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87baa801-afa3-4f80-abf4-cbffcd2da28e","Type":"ContainerDied","Data":"928358688b2c823081af23ba0d21359c61bd2748bd5accc7c373aa47cec18612"}
Feb 21 22:04:07 crc kubenswrapper[4717]: I0221 22:04:07.322355 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerName="ceilometer-central-agent" containerID="cri-o://aa76b112e5b48beaa8118dc9b12854105037c03337ea9da7c33c9425200cdc82" gracePeriod=30
Feb 21 22:04:07 crc kubenswrapper[4717]: I0221 22:04:07.322897 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerName="sg-core" containerID="cri-o://3ce74597aea0f5586a5b5cdd67a254581cb2e872030d44c6376212846bae64bd" gracePeriod=30
Feb 21 22:04:07 crc kubenswrapper[4717]: I0221 22:04:07.322944 4717 kuberuntime_container.go:808] "Killing
container with a grace period" pod="openstack/ceilometer-0" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerName="ceilometer-notification-agent" containerID="cri-o://a1af4826f0e7246c15309ff2d931327009ca572006624f4d6dee31ad3acecb90" gracePeriod=30
Feb 21 22:04:07 crc kubenswrapper[4717]: I0221 22:04:07.621026 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 21 22:04:07 crc kubenswrapper[4717]: I0221 22:04:07.989059 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52df3aba-f914-4946-877c-696b2a29635e" path="/var/lib/kubelet/pods/52df3aba-f914-4946-877c-696b2a29635e/volumes"
Feb 21 22:04:07 crc kubenswrapper[4717]: I0221 22:04:07.990241 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7af1cf64-7044-4170-9ba4-bcc17d97cbb2" path="/var/lib/kubelet/pods/7af1cf64-7044-4170-9ba4-bcc17d97cbb2/volumes"
Feb 21 22:04:07 crc kubenswrapper[4717]: I0221 22:04:07.990774 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dccedac-c29e-4ae2-bfac-d55b444cb715" path="/var/lib/kubelet/pods/7dccedac-c29e-4ae2-bfac-d55b444cb715/volumes"
Feb 21 22:04:08 crc kubenswrapper[4717]: I0221 22:04:08.361297 4717 generic.go:334] "Generic (PLEG): container finished" podID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerID="3ce74597aea0f5586a5b5cdd67a254581cb2e872030d44c6376212846bae64bd" exitCode=2
Feb 21 22:04:08 crc kubenswrapper[4717]: I0221 22:04:08.361337 4717 generic.go:334] "Generic (PLEG): container finished" podID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerID="a1af4826f0e7246c15309ff2d931327009ca572006624f4d6dee31ad3acecb90" exitCode=0
Feb 21 22:04:08 crc kubenswrapper[4717]: I0221 22:04:08.361385 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87baa801-afa3-4f80-abf4-cbffcd2da28e","Type":"ContainerDied","Data":"3ce74597aea0f5586a5b5cdd67a254581cb2e872030d44c6376212846bae64bd"}
Feb 21 22:04:08 crc
kubenswrapper[4717]: I0221 22:04:08.361416 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87baa801-afa3-4f80-abf4-cbffcd2da28e","Type":"ContainerDied","Data":"a1af4826f0e7246c15309ff2d931327009ca572006624f4d6dee31ad3acecb90"}
Feb 21 22:04:08 crc kubenswrapper[4717]: I0221 22:04:08.365698 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a8060c80-2f4b-4099-b5fd-841fadcdb329","Type":"ContainerStarted","Data":"78789ac54c1318120fb97b16ff084c40c2b49935099513c693337aa4c5df0915"}
Feb 21 22:04:08 crc kubenswrapper[4717]: I0221 22:04:08.365742 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a8060c80-2f4b-4099-b5fd-841fadcdb329","Type":"ContainerStarted","Data":"7216fba14d11e9d8a93bccdde50cf52a67ca8db18aa7a1481192603fbd23c70e"}
Feb 21 22:04:08 crc kubenswrapper[4717]: I0221 22:04:08.503208 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:04:08 crc kubenswrapper[4717]: I0221 22:04:08.504083 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f5bdf5f76-9rjst"
Feb 21 22:04:08 crc kubenswrapper[4717]: I0221 22:04:08.584147 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7f5656cf84-5gfzr"]
Feb 21 22:04:08 crc kubenswrapper[4717]: I0221 22:04:08.584645 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7f5656cf84-5gfzr" podUID="18ff72f4-66a6-4e32-aac7-7f55e600a1ae" containerName="placement-log" containerID="cri-o://d19d54f70cf252efcfebc1a1ea0831b791ae64e739391ed03f4b7ed0b0786009" gracePeriod=30
Feb 21 22:04:08 crc kubenswrapper[4717]: I0221 22:04:08.585032 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7f5656cf84-5gfzr"
podUID="18ff72f4-66a6-4e32-aac7-7f55e600a1ae" containerName="placement-api" containerID="cri-o://5c4e8c0e66ea1a088792a17483cd986369847244b124d1158e29623ee690c911" gracePeriod=30
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.063066 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.063115 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.406521 4717 generic.go:334] "Generic (PLEG): container finished" podID="18ff72f4-66a6-4e32-aac7-7f55e600a1ae" containerID="d19d54f70cf252efcfebc1a1ea0831b791ae64e739391ed03f4b7ed0b0786009" exitCode=143
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.406787 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f5656cf84-5gfzr" event={"ID":"18ff72f4-66a6-4e32-aac7-7f55e600a1ae","Type":"ContainerDied","Data":"d19d54f70cf252efcfebc1a1ea0831b791ae64e739391ed03f4b7ed0b0786009"}
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.410362 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a8060c80-2f4b-4099-b5fd-841fadcdb329","Type":"ContainerStarted","Data":"2bcf02113b2a910f7d54a1aefde0cc32533663cbcb8e331fbc1bef33229eca52"}
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.833433 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0"
podStartSLOduration=3.833415508 podStartE2EDuration="3.833415508s" podCreationTimestamp="2026-02-21 22:04:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:04:09.437782219 +0000 UTC m=+1064.219315841" watchObservedRunningTime="2026-02-21 22:04:09.833415508 +0000 UTC m=+1064.614949130"
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.834978 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2fx6v"]
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.836022 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2fx6v"
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.838967 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-46b6j"
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.839121 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.839116 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.875182 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2fx6v"]
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.947652 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-scripts\") pod \"nova-cell0-conductor-db-sync-2fx6v\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " pod="openstack/nova-cell0-conductor-db-sync-2fx6v"
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.947751 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"kube-api-access-rt5mj\" (UniqueName: \"kubernetes.io/projected/35c98495-0073-4b91-a0bc-84a2e6f04a01-kube-api-access-rt5mj\") pod \"nova-cell0-conductor-db-sync-2fx6v\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " pod="openstack/nova-cell0-conductor-db-sync-2fx6v"
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.947791 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-config-data\") pod \"nova-cell0-conductor-db-sync-2fx6v\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " pod="openstack/nova-cell0-conductor-db-sync-2fx6v"
Feb 21 22:04:09 crc kubenswrapper[4717]: I0221 22:04:09.947835 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2fx6v\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " pod="openstack/nova-cell0-conductor-db-sync-2fx6v"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.049206 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2fx6v\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " pod="openstack/nova-cell0-conductor-db-sync-2fx6v"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.049612 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-scripts\") pod \"nova-cell0-conductor-db-sync-2fx6v\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " pod="openstack/nova-cell0-conductor-db-sync-2fx6v"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.049682 4717 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-rt5mj\" (UniqueName: \"kubernetes.io/projected/35c98495-0073-4b91-a0bc-84a2e6f04a01-kube-api-access-rt5mj\") pod \"nova-cell0-conductor-db-sync-2fx6v\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " pod="openstack/nova-cell0-conductor-db-sync-2fx6v"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.049712 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-config-data\") pod \"nova-cell0-conductor-db-sync-2fx6v\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " pod="openstack/nova-cell0-conductor-db-sync-2fx6v"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.059743 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-scripts\") pod \"nova-cell0-conductor-db-sync-2fx6v\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " pod="openstack/nova-cell0-conductor-db-sync-2fx6v"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.059971 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2fx6v\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " pod="openstack/nova-cell0-conductor-db-sync-2fx6v"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.060996 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-config-data\") pod \"nova-cell0-conductor-db-sync-2fx6v\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " pod="openstack/nova-cell0-conductor-db-sync-2fx6v"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.081424 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for
volume \"kube-api-access-rt5mj\" (UniqueName: \"kubernetes.io/projected/35c98495-0073-4b91-a0bc-84a2e6f04a01-kube-api-access-rt5mj\") pod \"nova-cell0-conductor-db-sync-2fx6v\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " pod="openstack/nova-cell0-conductor-db-sync-2fx6v"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.167807 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2fx6v"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.349778 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.401288 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.428829 4717 generic.go:334] "Generic (PLEG): container finished" podID="d64df97f-f950-4a75-b5a1-7497f752a5cb" containerID="80342756a29c71937eb0da4308d5d408a8c98b432d6e960fdf7ca357bccf2a1e" exitCode=0
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.429655 4717 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.430049 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d64df97f-f950-4a75-b5a1-7497f752a5cb","Type":"ContainerDied","Data":"80342756a29c71937eb0da4308d5d408a8c98b432d6e960fdf7ca357bccf2a1e"}
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.430075 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d64df97f-f950-4a75-b5a1-7497f752a5cb","Type":"ContainerDied","Data":"9e174530a07701efaa6fcf6bbd7666e44ffe3e062cbc6b7b4d3c1e6d6a8dec57"}
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.430090 4717 scope.go:117] "RemoveContainer" containerID="80342756a29c71937eb0da4308d5d408a8c98b432d6e960fdf7ca357bccf2a1e"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.460432 4717 scope.go:117] "RemoveContainer" containerID="6bbcbd8899019791389b55bb5fd8b7d6e32917b828da2b7fef7c7f8b7c22a22c"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.460434 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6pwm\" (UniqueName: \"kubernetes.io/projected/d64df97f-f950-4a75-b5a1-7497f752a5cb-kube-api-access-l6pwm\") pod \"d64df97f-f950-4a75-b5a1-7497f752a5cb\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") "
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.460586 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-combined-ca-bundle\") pod \"d64df97f-f950-4a75-b5a1-7497f752a5cb\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") "
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.460704 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName:
\"kubernetes.io/empty-dir/d64df97f-f950-4a75-b5a1-7497f752a5cb-httpd-run\") pod \"d64df97f-f950-4a75-b5a1-7497f752a5cb\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") "
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.460739 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d64df97f-f950-4a75-b5a1-7497f752a5cb-logs\") pod \"d64df97f-f950-4a75-b5a1-7497f752a5cb\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") "
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.460768 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-internal-tls-certs\") pod \"d64df97f-f950-4a75-b5a1-7497f752a5cb\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") "
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.460837 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"d64df97f-f950-4a75-b5a1-7497f752a5cb\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") "
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.460855 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-scripts\") pod \"d64df97f-f950-4a75-b5a1-7497f752a5cb\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") "
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.460952 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-config-data\") pod \"d64df97f-f950-4a75-b5a1-7497f752a5cb\" (UID: \"d64df97f-f950-4a75-b5a1-7497f752a5cb\") "
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.461116 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for
volume "kubernetes.io/empty-dir/d64df97f-f950-4a75-b5a1-7497f752a5cb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d64df97f-f950-4a75-b5a1-7497f752a5cb" (UID: "d64df97f-f950-4a75-b5a1-7497f752a5cb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.461181 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d64df97f-f950-4a75-b5a1-7497f752a5cb-logs" (OuterVolumeSpecName: "logs") pod "d64df97f-f950-4a75-b5a1-7497f752a5cb" (UID: "d64df97f-f950-4a75-b5a1-7497f752a5cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.461745 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d64df97f-f950-4a75-b5a1-7497f752a5cb-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.461767 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d64df97f-f950-4a75-b5a1-7497f752a5cb-logs\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.466786 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d64df97f-f950-4a75-b5a1-7497f752a5cb-kube-api-access-l6pwm" (OuterVolumeSpecName: "kube-api-access-l6pwm") pod "d64df97f-f950-4a75-b5a1-7497f752a5cb" (UID: "d64df97f-f950-4a75-b5a1-7497f752a5cb"). InnerVolumeSpecName "kube-api-access-l6pwm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.467317 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "d64df97f-f950-4a75-b5a1-7497f752a5cb" (UID: "d64df97f-f950-4a75-b5a1-7497f752a5cb").
InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.471140 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-scripts" (OuterVolumeSpecName: "scripts") pod "d64df97f-f950-4a75-b5a1-7497f752a5cb" (UID: "d64df97f-f950-4a75-b5a1-7497f752a5cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.487770 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d64df97f-f950-4a75-b5a1-7497f752a5cb" (UID: "d64df97f-f950-4a75-b5a1-7497f752a5cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.502415 4717 scope.go:117] "RemoveContainer" containerID="80342756a29c71937eb0da4308d5d408a8c98b432d6e960fdf7ca357bccf2a1e"
Feb 21 22:04:10 crc kubenswrapper[4717]: E0221 22:04:10.502998 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80342756a29c71937eb0da4308d5d408a8c98b432d6e960fdf7ca357bccf2a1e\": container with ID starting with 80342756a29c71937eb0da4308d5d408a8c98b432d6e960fdf7ca357bccf2a1e not found: ID does not exist" containerID="80342756a29c71937eb0da4308d5d408a8c98b432d6e960fdf7ca357bccf2a1e"
Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.503044 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80342756a29c71937eb0da4308d5d408a8c98b432d6e960fdf7ca357bccf2a1e"} err="failed to get container status \"80342756a29c71937eb0da4308d5d408a8c98b432d6e960fdf7ca357bccf2a1e\": rpc error: code = NotFound desc = could not find
container \"80342756a29c71937eb0da4308d5d408a8c98b432d6e960fdf7ca357bccf2a1e\": container with ID starting with 80342756a29c71937eb0da4308d5d408a8c98b432d6e960fdf7ca357bccf2a1e not found: ID does not exist" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.503075 4717 scope.go:117] "RemoveContainer" containerID="6bbcbd8899019791389b55bb5fd8b7d6e32917b828da2b7fef7c7f8b7c22a22c" Feb 21 22:04:10 crc kubenswrapper[4717]: E0221 22:04:10.503618 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bbcbd8899019791389b55bb5fd8b7d6e32917b828da2b7fef7c7f8b7c22a22c\": container with ID starting with 6bbcbd8899019791389b55bb5fd8b7d6e32917b828da2b7fef7c7f8b7c22a22c not found: ID does not exist" containerID="6bbcbd8899019791389b55bb5fd8b7d6e32917b828da2b7fef7c7f8b7c22a22c" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.503647 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bbcbd8899019791389b55bb5fd8b7d6e32917b828da2b7fef7c7f8b7c22a22c"} err="failed to get container status \"6bbcbd8899019791389b55bb5fd8b7d6e32917b828da2b7fef7c7f8b7c22a22c\": rpc error: code = NotFound desc = could not find container \"6bbcbd8899019791389b55bb5fd8b7d6e32917b828da2b7fef7c7f8b7c22a22c\": container with ID starting with 6bbcbd8899019791389b55bb5fd8b7d6e32917b828da2b7fef7c7f8b7c22a22c not found: ID does not exist" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.517457 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d64df97f-f950-4a75-b5a1-7497f752a5cb" (UID: "d64df97f-f950-4a75-b5a1-7497f752a5cb"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.543590 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-config-data" (OuterVolumeSpecName: "config-data") pod "d64df97f-f950-4a75-b5a1-7497f752a5cb" (UID: "d64df97f-f950-4a75-b5a1-7497f752a5cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.564133 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.564175 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.564185 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.564194 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.564205 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6pwm\" (UniqueName: \"kubernetes.io/projected/d64df97f-f950-4a75-b5a1-7497f752a5cb-kube-api-access-l6pwm\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.565083 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d64df97f-f950-4a75-b5a1-7497f752a5cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.582645 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 21 22:04:10 crc kubenswrapper[4717]: W0221 22:04:10.633737 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35c98495_0073_4b91_a0bc_84a2e6f04a01.slice/crio-ed150942f6f32ead24a79dc919a842693ad40e54f9b24331a42ca0bddd62a235 WatchSource:0}: Error finding container ed150942f6f32ead24a79dc919a842693ad40e54f9b24331a42ca0bddd62a235: Status 404 returned error can't find the container with id ed150942f6f32ead24a79dc919a842693ad40e54f9b24331a42ca0bddd62a235 Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.634131 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2fx6v"] Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.666355 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.756264 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.765122 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.782761 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 21 22:04:10 crc kubenswrapper[4717]: E0221 22:04:10.783187 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64df97f-f950-4a75-b5a1-7497f752a5cb" containerName="glance-log" Feb 21 
22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.783204 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64df97f-f950-4a75-b5a1-7497f752a5cb" containerName="glance-log" Feb 21 22:04:10 crc kubenswrapper[4717]: E0221 22:04:10.783219 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64df97f-f950-4a75-b5a1-7497f752a5cb" containerName="glance-httpd" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.783225 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64df97f-f950-4a75-b5a1-7497f752a5cb" containerName="glance-httpd" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.783404 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64df97f-f950-4a75-b5a1-7497f752a5cb" containerName="glance-log" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.783419 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64df97f-f950-4a75-b5a1-7497f752a5cb" containerName="glance-httpd" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.784408 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.786740 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.786885 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.794144 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.869819 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/128a5f99-d2a9-4551-8fd0-45efc6017dab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.869911 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/128a5f99-d2a9-4551-8fd0-45efc6017dab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.869940 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.869969 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/128a5f99-d2a9-4551-8fd0-45efc6017dab-logs\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.870011 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f472\" (UniqueName: \"kubernetes.io/projected/128a5f99-d2a9-4551-8fd0-45efc6017dab-kube-api-access-7f472\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.870071 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128a5f99-d2a9-4551-8fd0-45efc6017dab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.870093 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/128a5f99-d2a9-4551-8fd0-45efc6017dab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.870124 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128a5f99-d2a9-4551-8fd0-45efc6017dab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.971323 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128a5f99-d2a9-4551-8fd0-45efc6017dab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.971423 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/128a5f99-d2a9-4551-8fd0-45efc6017dab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.971495 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/128a5f99-d2a9-4551-8fd0-45efc6017dab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.971535 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.971585 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/128a5f99-d2a9-4551-8fd0-45efc6017dab-logs\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.971627 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f472\" (UniqueName: 
\"kubernetes.io/projected/128a5f99-d2a9-4551-8fd0-45efc6017dab-kube-api-access-7f472\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.971709 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128a5f99-d2a9-4551-8fd0-45efc6017dab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.971747 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/128a5f99-d2a9-4551-8fd0-45efc6017dab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.972343 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.972359 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/128a5f99-d2a9-4551-8fd0-45efc6017dab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.973044 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/128a5f99-d2a9-4551-8fd0-45efc6017dab-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.979624 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128a5f99-d2a9-4551-8fd0-45efc6017dab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.980430 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/128a5f99-d2a9-4551-8fd0-45efc6017dab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.986549 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128a5f99-d2a9-4551-8fd0-45efc6017dab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.986875 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/128a5f99-d2a9-4551-8fd0-45efc6017dab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:10 crc kubenswrapper[4717]: I0221 22:04:10.989499 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f472\" (UniqueName: \"kubernetes.io/projected/128a5f99-d2a9-4551-8fd0-45efc6017dab-kube-api-access-7f472\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " 
pod="openstack/glance-default-internal-api-0" Feb 21 22:04:11 crc kubenswrapper[4717]: I0221 22:04:11.012814 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"128a5f99-d2a9-4551-8fd0-45efc6017dab\") " pod="openstack/glance-default-internal-api-0" Feb 21 22:04:11 crc kubenswrapper[4717]: I0221 22:04:11.109148 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 21 22:04:11 crc kubenswrapper[4717]: I0221 22:04:11.441135 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2fx6v" event={"ID":"35c98495-0073-4b91-a0bc-84a2e6f04a01","Type":"ContainerStarted","Data":"ed150942f6f32ead24a79dc919a842693ad40e54f9b24331a42ca0bddd62a235"} Feb 21 22:04:11 crc kubenswrapper[4717]: I0221 22:04:11.656070 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.015807 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d64df97f-f950-4a75-b5a1-7497f752a5cb" path="/var/lib/kubelet/pods/d64df97f-f950-4a75-b5a1-7497f752a5cb/volumes" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.289678 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.404578 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-config-data\") pod \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.404954 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-scripts\") pod \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.404994 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-logs\") pod \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.405023 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-internal-tls-certs\") pod \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.405177 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-public-tls-certs\") pod \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.405202 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-combined-ca-bundle\") pod \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.405264 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8twsp\" (UniqueName: \"kubernetes.io/projected/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-kube-api-access-8twsp\") pod \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\" (UID: \"18ff72f4-66a6-4e32-aac7-7f55e600a1ae\") " Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.405528 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-logs" (OuterVolumeSpecName: "logs") pod "18ff72f4-66a6-4e32-aac7-7f55e600a1ae" (UID: "18ff72f4-66a6-4e32-aac7-7f55e600a1ae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.410762 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-kube-api-access-8twsp" (OuterVolumeSpecName: "kube-api-access-8twsp") pod "18ff72f4-66a6-4e32-aac7-7f55e600a1ae" (UID: "18ff72f4-66a6-4e32-aac7-7f55e600a1ae"). InnerVolumeSpecName "kube-api-access-8twsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.410983 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-scripts" (OuterVolumeSpecName: "scripts") pod "18ff72f4-66a6-4e32-aac7-7f55e600a1ae" (UID: "18ff72f4-66a6-4e32-aac7-7f55e600a1ae"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.464319 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-config-data" (OuterVolumeSpecName: "config-data") pod "18ff72f4-66a6-4e32-aac7-7f55e600a1ae" (UID: "18ff72f4-66a6-4e32-aac7-7f55e600a1ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.468973 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"128a5f99-d2a9-4551-8fd0-45efc6017dab","Type":"ContainerStarted","Data":"054137f2989799de56e0143096a17633ae36b63b6b3bd7d679ace852d0eb0149"} Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.469019 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"128a5f99-d2a9-4551-8fd0-45efc6017dab","Type":"ContainerStarted","Data":"7f856842bd75d8291ec0ce36770724521b27c71874569363ec34c1bb3f9dd4d0"} Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.473015 4717 generic.go:334] "Generic (PLEG): container finished" podID="18ff72f4-66a6-4e32-aac7-7f55e600a1ae" containerID="5c4e8c0e66ea1a088792a17483cd986369847244b124d1158e29623ee690c911" exitCode=0 Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.473057 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f5656cf84-5gfzr" event={"ID":"18ff72f4-66a6-4e32-aac7-7f55e600a1ae","Type":"ContainerDied","Data":"5c4e8c0e66ea1a088792a17483cd986369847244b124d1158e29623ee690c911"} Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.473082 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f5656cf84-5gfzr" event={"ID":"18ff72f4-66a6-4e32-aac7-7f55e600a1ae","Type":"ContainerDied","Data":"505b23ed2e99f35dac5726c71b7cd354c86e1f62bbd4cb2b1c125530378edebf"} Feb 21 
22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.473100 4717 scope.go:117] "RemoveContainer" containerID="5c4e8c0e66ea1a088792a17483cd986369847244b124d1158e29623ee690c911" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.473225 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f5656cf84-5gfzr" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.482583 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18ff72f4-66a6-4e32-aac7-7f55e600a1ae" (UID: "18ff72f4-66a6-4e32-aac7-7f55e600a1ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.506847 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8twsp\" (UniqueName: \"kubernetes.io/projected/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-kube-api-access-8twsp\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.506894 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.506903 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.506911 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-logs\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.506919 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.533500 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "18ff72f4-66a6-4e32-aac7-7f55e600a1ae" (UID: "18ff72f4-66a6-4e32-aac7-7f55e600a1ae"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.545014 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "18ff72f4-66a6-4e32-aac7-7f55e600a1ae" (UID: "18ff72f4-66a6-4e32-aac7-7f55e600a1ae"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.572690 4717 scope.go:117] "RemoveContainer" containerID="d19d54f70cf252efcfebc1a1ea0831b791ae64e739391ed03f4b7ed0b0786009" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.598736 4717 scope.go:117] "RemoveContainer" containerID="5c4e8c0e66ea1a088792a17483cd986369847244b124d1158e29623ee690c911" Feb 21 22:04:12 crc kubenswrapper[4717]: E0221 22:04:12.599451 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c4e8c0e66ea1a088792a17483cd986369847244b124d1158e29623ee690c911\": container with ID starting with 5c4e8c0e66ea1a088792a17483cd986369847244b124d1158e29623ee690c911 not found: ID does not exist" containerID="5c4e8c0e66ea1a088792a17483cd986369847244b124d1158e29623ee690c911" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.599482 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5c4e8c0e66ea1a088792a17483cd986369847244b124d1158e29623ee690c911"} err="failed to get container status \"5c4e8c0e66ea1a088792a17483cd986369847244b124d1158e29623ee690c911\": rpc error: code = NotFound desc = could not find container \"5c4e8c0e66ea1a088792a17483cd986369847244b124d1158e29623ee690c911\": container with ID starting with 5c4e8c0e66ea1a088792a17483cd986369847244b124d1158e29623ee690c911 not found: ID does not exist" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.599526 4717 scope.go:117] "RemoveContainer" containerID="d19d54f70cf252efcfebc1a1ea0831b791ae64e739391ed03f4b7ed0b0786009" Feb 21 22:04:12 crc kubenswrapper[4717]: E0221 22:04:12.600122 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d19d54f70cf252efcfebc1a1ea0831b791ae64e739391ed03f4b7ed0b0786009\": container with ID starting with d19d54f70cf252efcfebc1a1ea0831b791ae64e739391ed03f4b7ed0b0786009 not found: ID does not exist" containerID="d19d54f70cf252efcfebc1a1ea0831b791ae64e739391ed03f4b7ed0b0786009" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.600164 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d19d54f70cf252efcfebc1a1ea0831b791ae64e739391ed03f4b7ed0b0786009"} err="failed to get container status \"d19d54f70cf252efcfebc1a1ea0831b791ae64e739391ed03f4b7ed0b0786009\": rpc error: code = NotFound desc = could not find container \"d19d54f70cf252efcfebc1a1ea0831b791ae64e739391ed03f4b7ed0b0786009\": container with ID starting with d19d54f70cf252efcfebc1a1ea0831b791ae64e739391ed03f4b7ed0b0786009 not found: ID does not exist" Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.608541 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:12 crc 
kubenswrapper[4717]: I0221 22:04:12.608566 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18ff72f4-66a6-4e32-aac7-7f55e600a1ae-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.806077 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7f5656cf84-5gfzr"]
Feb 21 22:04:12 crc kubenswrapper[4717]: I0221 22:04:12.814702 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7f5656cf84-5gfzr"]
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.315047 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.420890 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-scripts\") pod \"87baa801-afa3-4f80-abf4-cbffcd2da28e\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") "
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.421207 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87baa801-afa3-4f80-abf4-cbffcd2da28e-log-httpd\") pod \"87baa801-afa3-4f80-abf4-cbffcd2da28e\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") "
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.421314 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-config-data\") pod \"87baa801-afa3-4f80-abf4-cbffcd2da28e\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") "
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.421396 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-combined-ca-bundle\") pod \"87baa801-afa3-4f80-abf4-cbffcd2da28e\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") "
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.421646 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87baa801-afa3-4f80-abf4-cbffcd2da28e-run-httpd\") pod \"87baa801-afa3-4f80-abf4-cbffcd2da28e\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") "
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.421680 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-sg-core-conf-yaml\") pod \"87baa801-afa3-4f80-abf4-cbffcd2da28e\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") "
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.421741 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xswjh\" (UniqueName: \"kubernetes.io/projected/87baa801-afa3-4f80-abf4-cbffcd2da28e-kube-api-access-xswjh\") pod \"87baa801-afa3-4f80-abf4-cbffcd2da28e\" (UID: \"87baa801-afa3-4f80-abf4-cbffcd2da28e\") "
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.421744 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87baa801-afa3-4f80-abf4-cbffcd2da28e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "87baa801-afa3-4f80-abf4-cbffcd2da28e" (UID: "87baa801-afa3-4f80-abf4-cbffcd2da28e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.421995 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87baa801-afa3-4f80-abf4-cbffcd2da28e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "87baa801-afa3-4f80-abf4-cbffcd2da28e" (UID: "87baa801-afa3-4f80-abf4-cbffcd2da28e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.422286 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87baa801-afa3-4f80-abf4-cbffcd2da28e-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.422303 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87baa801-afa3-4f80-abf4-cbffcd2da28e-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.441988 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87baa801-afa3-4f80-abf4-cbffcd2da28e-kube-api-access-xswjh" (OuterVolumeSpecName: "kube-api-access-xswjh") pod "87baa801-afa3-4f80-abf4-cbffcd2da28e" (UID: "87baa801-afa3-4f80-abf4-cbffcd2da28e"). InnerVolumeSpecName "kube-api-access-xswjh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.442991 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-scripts" (OuterVolumeSpecName: "scripts") pod "87baa801-afa3-4f80-abf4-cbffcd2da28e" (UID: "87baa801-afa3-4f80-abf4-cbffcd2da28e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.455879 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "87baa801-afa3-4f80-abf4-cbffcd2da28e" (UID: "87baa801-afa3-4f80-abf4-cbffcd2da28e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.485954 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"128a5f99-d2a9-4551-8fd0-45efc6017dab","Type":"ContainerStarted","Data":"e4215e0f033ce9a5ad51e046adb07c531bd8e0bd9b5bedce3d8daaaebf97de9f"}
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.492131 4717 generic.go:334] "Generic (PLEG): container finished" podID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerID="aa76b112e5b48beaa8118dc9b12854105037c03337ea9da7c33c9425200cdc82" exitCode=0
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.492274 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87baa801-afa3-4f80-abf4-cbffcd2da28e","Type":"ContainerDied","Data":"aa76b112e5b48beaa8118dc9b12854105037c03337ea9da7c33c9425200cdc82"}
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.492593 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87baa801-afa3-4f80-abf4-cbffcd2da28e","Type":"ContainerDied","Data":"f44bab6d6ad295b226616c3fb4bd8cca32f45114eab8b1228dec16c2e3393f90"}
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.492704 4717 scope.go:117] "RemoveContainer" containerID="928358688b2c823081af23ba0d21359c61bd2748bd5accc7c373aa47cec18612"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.492344 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.505110 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.5048186169999997 podStartE2EDuration="3.504818617s" podCreationTimestamp="2026-02-21 22:04:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:04:13.50410466 +0000 UTC m=+1068.285638282" watchObservedRunningTime="2026-02-21 22:04:13.504818617 +0000 UTC m=+1068.286352239"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.525539 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.525570 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xswjh\" (UniqueName: \"kubernetes.io/projected/87baa801-afa3-4f80-abf4-cbffcd2da28e-kube-api-access-xswjh\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.525580 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-scripts\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.530125 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87baa801-afa3-4f80-abf4-cbffcd2da28e" (UID: "87baa801-afa3-4f80-abf4-cbffcd2da28e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.535316 4717 scope.go:117] "RemoveContainer" containerID="3ce74597aea0f5586a5b5cdd67a254581cb2e872030d44c6376212846bae64bd"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.541394 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-config-data" (OuterVolumeSpecName: "config-data") pod "87baa801-afa3-4f80-abf4-cbffcd2da28e" (UID: "87baa801-afa3-4f80-abf4-cbffcd2da28e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.565627 4717 scope.go:117] "RemoveContainer" containerID="a1af4826f0e7246c15309ff2d931327009ca572006624f4d6dee31ad3acecb90"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.599961 4717 scope.go:117] "RemoveContainer" containerID="aa76b112e5b48beaa8118dc9b12854105037c03337ea9da7c33c9425200cdc82"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.622714 4717 scope.go:117] "RemoveContainer" containerID="928358688b2c823081af23ba0d21359c61bd2748bd5accc7c373aa47cec18612"
Feb 21 22:04:13 crc kubenswrapper[4717]: E0221 22:04:13.623124 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"928358688b2c823081af23ba0d21359c61bd2748bd5accc7c373aa47cec18612\": container with ID starting with 928358688b2c823081af23ba0d21359c61bd2748bd5accc7c373aa47cec18612 not found: ID does not exist" containerID="928358688b2c823081af23ba0d21359c61bd2748bd5accc7c373aa47cec18612"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.623153 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"928358688b2c823081af23ba0d21359c61bd2748bd5accc7c373aa47cec18612"} err="failed to get container status \"928358688b2c823081af23ba0d21359c61bd2748bd5accc7c373aa47cec18612\": rpc error: code = NotFound desc = could not find container \"928358688b2c823081af23ba0d21359c61bd2748bd5accc7c373aa47cec18612\": container with ID starting with 928358688b2c823081af23ba0d21359c61bd2748bd5accc7c373aa47cec18612 not found: ID does not exist"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.623173 4717 scope.go:117] "RemoveContainer" containerID="3ce74597aea0f5586a5b5cdd67a254581cb2e872030d44c6376212846bae64bd"
Feb 21 22:04:13 crc kubenswrapper[4717]: E0221 22:04:13.624021 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ce74597aea0f5586a5b5cdd67a254581cb2e872030d44c6376212846bae64bd\": container with ID starting with 3ce74597aea0f5586a5b5cdd67a254581cb2e872030d44c6376212846bae64bd not found: ID does not exist" containerID="3ce74597aea0f5586a5b5cdd67a254581cb2e872030d44c6376212846bae64bd"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.624060 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce74597aea0f5586a5b5cdd67a254581cb2e872030d44c6376212846bae64bd"} err="failed to get container status \"3ce74597aea0f5586a5b5cdd67a254581cb2e872030d44c6376212846bae64bd\": rpc error: code = NotFound desc = could not find container \"3ce74597aea0f5586a5b5cdd67a254581cb2e872030d44c6376212846bae64bd\": container with ID starting with 3ce74597aea0f5586a5b5cdd67a254581cb2e872030d44c6376212846bae64bd not found: ID does not exist"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.624089 4717 scope.go:117] "RemoveContainer" containerID="a1af4826f0e7246c15309ff2d931327009ca572006624f4d6dee31ad3acecb90"
Feb 21 22:04:13 crc kubenswrapper[4717]: E0221 22:04:13.624740 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1af4826f0e7246c15309ff2d931327009ca572006624f4d6dee31ad3acecb90\": container with ID starting with a1af4826f0e7246c15309ff2d931327009ca572006624f4d6dee31ad3acecb90 not found: ID does not exist" containerID="a1af4826f0e7246c15309ff2d931327009ca572006624f4d6dee31ad3acecb90"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.624761 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1af4826f0e7246c15309ff2d931327009ca572006624f4d6dee31ad3acecb90"} err="failed to get container status \"a1af4826f0e7246c15309ff2d931327009ca572006624f4d6dee31ad3acecb90\": rpc error: code = NotFound desc = could not find container \"a1af4826f0e7246c15309ff2d931327009ca572006624f4d6dee31ad3acecb90\": container with ID starting with a1af4826f0e7246c15309ff2d931327009ca572006624f4d6dee31ad3acecb90 not found: ID does not exist"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.624779 4717 scope.go:117] "RemoveContainer" containerID="aa76b112e5b48beaa8118dc9b12854105037c03337ea9da7c33c9425200cdc82"
Feb 21 22:04:13 crc kubenswrapper[4717]: E0221 22:04:13.625016 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa76b112e5b48beaa8118dc9b12854105037c03337ea9da7c33c9425200cdc82\": container with ID starting with aa76b112e5b48beaa8118dc9b12854105037c03337ea9da7c33c9425200cdc82 not found: ID does not exist" containerID="aa76b112e5b48beaa8118dc9b12854105037c03337ea9da7c33c9425200cdc82"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.625039 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa76b112e5b48beaa8118dc9b12854105037c03337ea9da7c33c9425200cdc82"} err="failed to get container status \"aa76b112e5b48beaa8118dc9b12854105037c03337ea9da7c33c9425200cdc82\": rpc error: code = NotFound desc = could not find container \"aa76b112e5b48beaa8118dc9b12854105037c03337ea9da7c33c9425200cdc82\": container with ID starting with aa76b112e5b48beaa8118dc9b12854105037c03337ea9da7c33c9425200cdc82 not found: ID does not exist"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.627422 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-config-data\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.627454 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87baa801-afa3-4f80-abf4-cbffcd2da28e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.826479 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.836181 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.849798 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 21 22:04:13 crc kubenswrapper[4717]: E0221 22:04:13.850148 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerName="ceilometer-central-agent"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.850166 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerName="ceilometer-central-agent"
Feb 21 22:04:13 crc kubenswrapper[4717]: E0221 22:04:13.850190 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerName="sg-core"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.850197 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerName="sg-core"
Feb 21 22:04:13 crc kubenswrapper[4717]: E0221 22:04:13.850207 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ff72f4-66a6-4e32-aac7-7f55e600a1ae" containerName="placement-log"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.850213 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ff72f4-66a6-4e32-aac7-7f55e600a1ae" containerName="placement-log"
Feb 21 22:04:13 crc kubenswrapper[4717]: E0221 22:04:13.850229 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerName="proxy-httpd"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.850235 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerName="proxy-httpd"
Feb 21 22:04:13 crc kubenswrapper[4717]: E0221 22:04:13.850248 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ff72f4-66a6-4e32-aac7-7f55e600a1ae" containerName="placement-api"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.850254 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ff72f4-66a6-4e32-aac7-7f55e600a1ae" containerName="placement-api"
Feb 21 22:04:13 crc kubenswrapper[4717]: E0221 22:04:13.850263 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerName="ceilometer-notification-agent"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.850268 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerName="ceilometer-notification-agent"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.850411 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerName="sg-core"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.850425 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerName="ceilometer-central-agent"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.850439 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ff72f4-66a6-4e32-aac7-7f55e600a1ae" containerName="placement-api"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.850450 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerName="proxy-httpd"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.850460 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" containerName="ceilometer-notification-agent"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.850466 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ff72f4-66a6-4e32-aac7-7f55e600a1ae" containerName="placement-log"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.852140 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.854063 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.854464 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.860112 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.867331 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.931482 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.931559 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xblws\" (UniqueName: \"kubernetes.io/projected/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-kube-api-access-xblws\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.931689 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-run-httpd\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.931807 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-scripts\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.931940 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.932005 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-log-httpd\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.932151 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:13 crc kubenswrapper[4717]: I0221 22:04:13.932183 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-config-data\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.002241 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18ff72f4-66a6-4e32-aac7-7f55e600a1ae" path="/var/lib/kubelet/pods/18ff72f4-66a6-4e32-aac7-7f55e600a1ae/volumes"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.002821 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87baa801-afa3-4f80-abf4-cbffcd2da28e" path="/var/lib/kubelet/pods/87baa801-afa3-4f80-abf4-cbffcd2da28e/volumes"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.034108 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xblws\" (UniqueName: \"kubernetes.io/projected/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-kube-api-access-xblws\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.034169 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-run-httpd\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.034202 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-scripts\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.034234 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.034262 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-log-httpd\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.034317 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.034341 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-config-data\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.034386 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.034839 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-run-httpd\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.037492 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-log-httpd\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.038448 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.039987 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.042468 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.046787 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-config-data\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.051297 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-scripts\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.053630 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xblws\" (UniqueName: \"kubernetes.io/projected/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-kube-api-access-xblws\") pod \"ceilometer-0\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.166620 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 21 22:04:14 crc kubenswrapper[4717]: I0221 22:04:14.609001 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 21 22:04:14 crc kubenswrapper[4717]: W0221 22:04:14.617348 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1fb82d6_90b7_4b45_8810_f1bb0a3aa276.slice/crio-e859e27a5078a54d203a2990b1f694e5421f366a4f6191cc11902128324ca983 WatchSource:0}: Error finding container e859e27a5078a54d203a2990b1f694e5421f366a4f6191cc11902128324ca983: Status 404 returned error can't find the container with id e859e27a5078a54d203a2990b1f694e5421f366a4f6191cc11902128324ca983
Feb 21 22:04:15 crc kubenswrapper[4717]: I0221 22:04:15.280732 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 21 22:04:15 crc kubenswrapper[4717]: I0221 22:04:15.524371 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276","Type":"ContainerStarted","Data":"e859e27a5078a54d203a2990b1f694e5421f366a4f6191cc11902128324ca983"}
Feb 21 22:04:17 crc kubenswrapper[4717]: I0221 22:04:17.002831 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 21 22:04:17 crc kubenswrapper[4717]: I0221 22:04:17.003214 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 21 22:04:17 crc kubenswrapper[4717]: I0221 22:04:17.036917 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 21 22:04:17 crc kubenswrapper[4717]: I0221 22:04:17.048402 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 21 22:04:17 crc kubenswrapper[4717]: I0221 22:04:17.545428 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 21 22:04:17 crc kubenswrapper[4717]: I0221 22:04:17.545789 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 21 22:04:19 crc kubenswrapper[4717]: I0221 22:04:19.441973 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 21 22:04:19 crc kubenswrapper[4717]: I0221 22:04:19.445499 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 21 22:04:20 crc kubenswrapper[4717]: I0221 22:04:20.569117 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276","Type":"ContainerStarted","Data":"af995bdf06c519ea7904d0e90eb8a9c76eb03ca3dbe96f398321e54761b7244f"}
Feb 21 22:04:20 crc kubenswrapper[4717]: I0221 22:04:20.569570 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276","Type":"ContainerStarted","Data":"3868f906c099280dfa217f5c52793530aa3ad605c569f16c70fd916e5617f6f2"}
Feb 21 22:04:20 crc kubenswrapper[4717]: I0221 22:04:20.573221 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2fx6v" event={"ID":"35c98495-0073-4b91-a0bc-84a2e6f04a01","Type":"ContainerStarted","Data":"d0c5a4385227f76979b13e9221d9bcfee079a7bdc6efd002cbaf270f65e2f854"}
Feb 21 22:04:20 crc kubenswrapper[4717]: I0221 22:04:20.599256 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-2fx6v" podStartSLOduration=2.556232326 podStartE2EDuration="11.599240341s" podCreationTimestamp="2026-02-21 22:04:09 +0000 UTC" firstStartedPulling="2026-02-21 22:04:10.636263743 +0000 UTC m=+1065.417797365" lastFinishedPulling="2026-02-21 22:04:19.679271758 +0000 UTC m=+1074.460805380" observedRunningTime="2026-02-21 22:04:20.589069347 +0000 UTC m=+1075.370602969" watchObservedRunningTime="2026-02-21 22:04:20.599240341 +0000 UTC m=+1075.380773963"
Feb 21 22:04:21 crc kubenswrapper[4717]: I0221 22:04:21.110095 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 21 22:04:21 crc kubenswrapper[4717]: I0221 22:04:21.110427 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 21 22:04:21 crc kubenswrapper[4717]: I0221 22:04:21.153297 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 21 22:04:21 crc kubenswrapper[4717]: I0221 22:04:21.154432 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 21 22:04:21 crc kubenswrapper[4717]: I0221 22:04:21.591369 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276","Type":"ContainerStarted","Data":"1ed9bc42082af35a4a8c3fb1701cdaf94d86862fb8ce9a870495ebab2d04f70e"}
Feb 21 22:04:21 crc kubenswrapper[4717]: I0221 22:04:21.593381 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 21 22:04:21 crc kubenswrapper[4717]: I0221 22:04:21.593430 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 21 22:04:23 crc kubenswrapper[4717]: I0221 22:04:23.552202 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 21 22:04:23 crc kubenswrapper[4717]: I0221 22:04:23.559076 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 21 22:04:23 crc kubenswrapper[4717]: I0221 22:04:23.623519 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="ceilometer-central-agent" containerID="cri-o://3868f906c099280dfa217f5c52793530aa3ad605c569f16c70fd916e5617f6f2" gracePeriod=30
Feb 21 22:04:23 crc kubenswrapper[4717]: I0221 22:04:23.623814 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276","Type":"ContainerStarted","Data":"b2ff14be9f6554059156344bb2d9862054fcc7daefac9cc2183163d574a53cde"}
Feb 21 22:04:23 crc kubenswrapper[4717]: I0221 22:04:23.623894 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 21 22:04:23 crc kubenswrapper[4717]: I0221 22:04:23.624204 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="proxy-httpd" containerID="cri-o://b2ff14be9f6554059156344bb2d9862054fcc7daefac9cc2183163d574a53cde" gracePeriod=30
Feb 21 22:04:23 crc kubenswrapper[4717]: I0221 22:04:23.624266 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="sg-core" containerID="cri-o://1ed9bc42082af35a4a8c3fb1701cdaf94d86862fb8ce9a870495ebab2d04f70e" gracePeriod=30
Feb 21 22:04:23 crc kubenswrapper[4717]: I0221 22:04:23.624314 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="ceilometer-notification-agent" containerID="cri-o://af995bdf06c519ea7904d0e90eb8a9c76eb03ca3dbe96f398321e54761b7244f" gracePeriod=30
Feb 21 22:04:23 crc kubenswrapper[4717]: I0221 22:04:23.666390 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.214786817 podStartE2EDuration="10.666365313s" podCreationTimestamp="2026-02-21 22:04:13 +0000 UTC" firstStartedPulling="2026-02-21 22:04:14.624216368 +0000 UTC m=+1069.405749990" lastFinishedPulling="2026-02-21 22:04:23.075794854 +0000 UTC m=+1077.857328486" observedRunningTime="2026-02-21 22:04:23.659243091 +0000 UTC m=+1078.440776713" watchObservedRunningTime="2026-02-21 22:04:23.666365313 +0000 UTC m=+1078.447898945"
Feb 21 22:04:24 crc kubenswrapper[4717]: I0221 22:04:24.634159 4717 generic.go:334] "Generic (PLEG): container finished" podID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerID="b2ff14be9f6554059156344bb2d9862054fcc7daefac9cc2183163d574a53cde" exitCode=0
Feb 21 22:04:24 crc kubenswrapper[4717]: I0221 22:04:24.634426 4717 generic.go:334] "Generic (PLEG): container finished" podID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerID="1ed9bc42082af35a4a8c3fb1701cdaf94d86862fb8ce9a870495ebab2d04f70e" exitCode=2
Feb 21 22:04:24 crc kubenswrapper[4717]: I0221 22:04:24.634435 4717 generic.go:334] "Generic (PLEG): container finished" podID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerID="af995bdf06c519ea7904d0e90eb8a9c76eb03ca3dbe96f398321e54761b7244f" exitCode=0
Feb 21 22:04:24 crc kubenswrapper[4717]: I0221 22:04:24.634203 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276","Type":"ContainerDied","Data":"b2ff14be9f6554059156344bb2d9862054fcc7daefac9cc2183163d574a53cde"}
Feb 21 22:04:24 crc kubenswrapper[4717]: I0221 22:04:24.634539 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276","Type":"ContainerDied","Data":"1ed9bc42082af35a4a8c3fb1701cdaf94d86862fb8ce9a870495ebab2d04f70e"}
Feb 21 22:04:24 crc kubenswrapper[4717]: I0221 22:04:24.634553 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276","Type":"ContainerDied","Data":"af995bdf06c519ea7904d0e90eb8a9c76eb03ca3dbe96f398321e54761b7244f"}
Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.450784 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.573944 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-config-data\") pod \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") "
Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.573995 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-combined-ca-bundle\") pod \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") "
Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.574028 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-ceilometer-tls-certs\") pod 
\"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.574110 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-sg-core-conf-yaml\") pod \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.574160 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-scripts\") pod \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.574233 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-log-httpd\") pod \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.574275 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-run-httpd\") pod \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.574304 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xblws\" (UniqueName: \"kubernetes.io/projected/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-kube-api-access-xblws\") pod \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\" (UID: \"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276\") " Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.575120 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" (UID: "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.575201 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" (UID: "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.580184 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-scripts" (OuterVolumeSpecName: "scripts") pod "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" (UID: "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.581061 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-kube-api-access-xblws" (OuterVolumeSpecName: "kube-api-access-xblws") pod "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" (UID: "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276"). InnerVolumeSpecName "kube-api-access-xblws". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.621499 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" (UID: "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.665437 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" (UID: "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.676795 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xblws\" (UniqueName: \"kubernetes.io/projected/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-kube-api-access-xblws\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.676979 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.677063 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.677145 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.677252 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.677414 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.680653 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" (UID: "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.687234 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-config-data" (OuterVolumeSpecName: "config-data") pod "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" (UID: "e1fb82d6-90b7-4b45-8810-f1bb0a3aa276"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.713696 4717 generic.go:334] "Generic (PLEG): container finished" podID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerID="3868f906c099280dfa217f5c52793530aa3ad605c569f16c70fd916e5617f6f2" exitCode=0 Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.714052 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276","Type":"ContainerDied","Data":"3868f906c099280dfa217f5c52793530aa3ad605c569f16c70fd916e5617f6f2"} Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.714518 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1fb82d6-90b7-4b45-8810-f1bb0a3aa276","Type":"ContainerDied","Data":"e859e27a5078a54d203a2990b1f694e5421f366a4f6191cc11902128324ca983"} Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.714104 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.714719 4717 scope.go:117] "RemoveContainer" containerID="b2ff14be9f6554059156344bb2d9862054fcc7daefac9cc2183163d574a53cde" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.750674 4717 scope.go:117] "RemoveContainer" containerID="1ed9bc42082af35a4a8c3fb1701cdaf94d86862fb8ce9a870495ebab2d04f70e" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.768313 4717 scope.go:117] "RemoveContainer" containerID="af995bdf06c519ea7904d0e90eb8a9c76eb03ca3dbe96f398321e54761b7244f" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.774019 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.779556 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.779596 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.784228 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.795643 4717 scope.go:117] "RemoveContainer" containerID="3868f906c099280dfa217f5c52793530aa3ad605c569f16c70fd916e5617f6f2" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.801584 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:04:30 crc kubenswrapper[4717]: E0221 22:04:30.802002 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="sg-core" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 
22:04:30.802024 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="sg-core" Feb 21 22:04:30 crc kubenswrapper[4717]: E0221 22:04:30.802043 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="ceilometer-central-agent" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.802052 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="ceilometer-central-agent" Feb 21 22:04:30 crc kubenswrapper[4717]: E0221 22:04:30.802065 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="ceilometer-notification-agent" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.802075 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="ceilometer-notification-agent" Feb 21 22:04:30 crc kubenswrapper[4717]: E0221 22:04:30.802093 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="proxy-httpd" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.802100 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="proxy-httpd" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.802297 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="proxy-httpd" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.802315 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="ceilometer-notification-agent" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.802371 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="ceilometer-central-agent" Feb 
21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.802392 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" containerName="sg-core" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.804350 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.811356 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.811613 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.812039 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.825448 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.831445 4717 scope.go:117] "RemoveContainer" containerID="b2ff14be9f6554059156344bb2d9862054fcc7daefac9cc2183163d574a53cde" Feb 21 22:04:30 crc kubenswrapper[4717]: E0221 22:04:30.834512 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2ff14be9f6554059156344bb2d9862054fcc7daefac9cc2183163d574a53cde\": container with ID starting with b2ff14be9f6554059156344bb2d9862054fcc7daefac9cc2183163d574a53cde not found: ID does not exist" containerID="b2ff14be9f6554059156344bb2d9862054fcc7daefac9cc2183163d574a53cde" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.834552 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2ff14be9f6554059156344bb2d9862054fcc7daefac9cc2183163d574a53cde"} err="failed to get container status \"b2ff14be9f6554059156344bb2d9862054fcc7daefac9cc2183163d574a53cde\": rpc 
error: code = NotFound desc = could not find container \"b2ff14be9f6554059156344bb2d9862054fcc7daefac9cc2183163d574a53cde\": container with ID starting with b2ff14be9f6554059156344bb2d9862054fcc7daefac9cc2183163d574a53cde not found: ID does not exist" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.834573 4717 scope.go:117] "RemoveContainer" containerID="1ed9bc42082af35a4a8c3fb1701cdaf94d86862fb8ce9a870495ebab2d04f70e" Feb 21 22:04:30 crc kubenswrapper[4717]: E0221 22:04:30.839518 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ed9bc42082af35a4a8c3fb1701cdaf94d86862fb8ce9a870495ebab2d04f70e\": container with ID starting with 1ed9bc42082af35a4a8c3fb1701cdaf94d86862fb8ce9a870495ebab2d04f70e not found: ID does not exist" containerID="1ed9bc42082af35a4a8c3fb1701cdaf94d86862fb8ce9a870495ebab2d04f70e" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.839573 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ed9bc42082af35a4a8c3fb1701cdaf94d86862fb8ce9a870495ebab2d04f70e"} err="failed to get container status \"1ed9bc42082af35a4a8c3fb1701cdaf94d86862fb8ce9a870495ebab2d04f70e\": rpc error: code = NotFound desc = could not find container \"1ed9bc42082af35a4a8c3fb1701cdaf94d86862fb8ce9a870495ebab2d04f70e\": container with ID starting with 1ed9bc42082af35a4a8c3fb1701cdaf94d86862fb8ce9a870495ebab2d04f70e not found: ID does not exist" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.839615 4717 scope.go:117] "RemoveContainer" containerID="af995bdf06c519ea7904d0e90eb8a9c76eb03ca3dbe96f398321e54761b7244f" Feb 21 22:04:30 crc kubenswrapper[4717]: E0221 22:04:30.841180 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af995bdf06c519ea7904d0e90eb8a9c76eb03ca3dbe96f398321e54761b7244f\": container with ID starting with 
af995bdf06c519ea7904d0e90eb8a9c76eb03ca3dbe96f398321e54761b7244f not found: ID does not exist" containerID="af995bdf06c519ea7904d0e90eb8a9c76eb03ca3dbe96f398321e54761b7244f" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.841212 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af995bdf06c519ea7904d0e90eb8a9c76eb03ca3dbe96f398321e54761b7244f"} err="failed to get container status \"af995bdf06c519ea7904d0e90eb8a9c76eb03ca3dbe96f398321e54761b7244f\": rpc error: code = NotFound desc = could not find container \"af995bdf06c519ea7904d0e90eb8a9c76eb03ca3dbe96f398321e54761b7244f\": container with ID starting with af995bdf06c519ea7904d0e90eb8a9c76eb03ca3dbe96f398321e54761b7244f not found: ID does not exist" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.841235 4717 scope.go:117] "RemoveContainer" containerID="3868f906c099280dfa217f5c52793530aa3ad605c569f16c70fd916e5617f6f2" Feb 21 22:04:30 crc kubenswrapper[4717]: E0221 22:04:30.842230 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3868f906c099280dfa217f5c52793530aa3ad605c569f16c70fd916e5617f6f2\": container with ID starting with 3868f906c099280dfa217f5c52793530aa3ad605c569f16c70fd916e5617f6f2 not found: ID does not exist" containerID="3868f906c099280dfa217f5c52793530aa3ad605c569f16c70fd916e5617f6f2" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.842257 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3868f906c099280dfa217f5c52793530aa3ad605c569f16c70fd916e5617f6f2"} err="failed to get container status \"3868f906c099280dfa217f5c52793530aa3ad605c569f16c70fd916e5617f6f2\": rpc error: code = NotFound desc = could not find container \"3868f906c099280dfa217f5c52793530aa3ad605c569f16c70fd916e5617f6f2\": container with ID starting with 3868f906c099280dfa217f5c52793530aa3ad605c569f16c70fd916e5617f6f2 not found: ID does not 
exist" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.881246 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.881313 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64cab48b-28c7-4f30-9e12-b3001c6a5d35-run-httpd\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.881380 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-config-data\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.881410 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.881438 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64cab48b-28c7-4f30-9e12-b3001c6a5d35-log-httpd\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.881483 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-scripts\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.881505 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.881530 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghd8n\" (UniqueName: \"kubernetes.io/projected/64cab48b-28c7-4f30-9e12-b3001c6a5d35-kube-api-access-ghd8n\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.984841 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-config-data\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.984910 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.984939 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64cab48b-28c7-4f30-9e12-b3001c6a5d35-log-httpd\") pod 
\"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.984975 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-scripts\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.984992 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.985014 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghd8n\" (UniqueName: \"kubernetes.io/projected/64cab48b-28c7-4f30-9e12-b3001c6a5d35-kube-api-access-ghd8n\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.985072 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.985093 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64cab48b-28c7-4f30-9e12-b3001c6a5d35-run-httpd\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.985601 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64cab48b-28c7-4f30-9e12-b3001c6a5d35-run-httpd\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.990294 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64cab48b-28c7-4f30-9e12-b3001c6a5d35-log-httpd\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.993791 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-config-data\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.998309 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-scripts\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:30 crc kubenswrapper[4717]: I0221 22:04:30.998980 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:31 crc kubenswrapper[4717]: I0221 22:04:31.000107 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:31 crc kubenswrapper[4717]: I0221 
22:04:31.003981 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:31 crc kubenswrapper[4717]: I0221 22:04:31.008375 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghd8n\" (UniqueName: \"kubernetes.io/projected/64cab48b-28c7-4f30-9e12-b3001c6a5d35-kube-api-access-ghd8n\") pod \"ceilometer-0\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " pod="openstack/ceilometer-0" Feb 21 22:04:31 crc kubenswrapper[4717]: I0221 22:04:31.131387 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 21 22:04:31 crc kubenswrapper[4717]: I0221 22:04:31.577583 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:04:31 crc kubenswrapper[4717]: W0221 22:04:31.599234 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64cab48b_28c7_4f30_9e12_b3001c6a5d35.slice/crio-5754fda59fa5773cc79b9a2580c5680bf3e6cc8eca325624ae66457f14618c88 WatchSource:0}: Error finding container 5754fda59fa5773cc79b9a2580c5680bf3e6cc8eca325624ae66457f14618c88: Status 404 returned error can't find the container with id 5754fda59fa5773cc79b9a2580c5680bf3e6cc8eca325624ae66457f14618c88 Feb 21 22:04:31 crc kubenswrapper[4717]: I0221 22:04:31.723363 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64cab48b-28c7-4f30-9e12-b3001c6a5d35","Type":"ContainerStarted","Data":"5754fda59fa5773cc79b9a2580c5680bf3e6cc8eca325624ae66457f14618c88"} Feb 21 22:04:31 crc kubenswrapper[4717]: I0221 22:04:31.725809 4717 generic.go:334] "Generic (PLEG): container finished" podID="35c98495-0073-4b91-a0bc-84a2e6f04a01" 
containerID="d0c5a4385227f76979b13e9221d9bcfee079a7bdc6efd002cbaf270f65e2f854" exitCode=0 Feb 21 22:04:31 crc kubenswrapper[4717]: I0221 22:04:31.725941 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2fx6v" event={"ID":"35c98495-0073-4b91-a0bc-84a2e6f04a01","Type":"ContainerDied","Data":"d0c5a4385227f76979b13e9221d9bcfee079a7bdc6efd002cbaf270f65e2f854"} Feb 21 22:04:32 crc kubenswrapper[4717]: I0221 22:04:32.017658 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1fb82d6-90b7-4b45-8810-f1bb0a3aa276" path="/var/lib/kubelet/pods/e1fb82d6-90b7-4b45-8810-f1bb0a3aa276/volumes" Feb 21 22:04:32 crc kubenswrapper[4717]: I0221 22:04:32.740568 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64cab48b-28c7-4f30-9e12-b3001c6a5d35","Type":"ContainerStarted","Data":"1fc3990da8a449c797db79a0daf55c07976f37c64f662b1070c8a9ccfb737cb1"} Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.225996 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2fx6v" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.332808 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-config-data\") pod \"35c98495-0073-4b91-a0bc-84a2e6f04a01\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.332923 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-combined-ca-bundle\") pod \"35c98495-0073-4b91-a0bc-84a2e6f04a01\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.332951 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt5mj\" (UniqueName: \"kubernetes.io/projected/35c98495-0073-4b91-a0bc-84a2e6f04a01-kube-api-access-rt5mj\") pod \"35c98495-0073-4b91-a0bc-84a2e6f04a01\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.332990 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-scripts\") pod \"35c98495-0073-4b91-a0bc-84a2e6f04a01\" (UID: \"35c98495-0073-4b91-a0bc-84a2e6f04a01\") " Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.338816 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-scripts" (OuterVolumeSpecName: "scripts") pod "35c98495-0073-4b91-a0bc-84a2e6f04a01" (UID: "35c98495-0073-4b91-a0bc-84a2e6f04a01"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.339263 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c98495-0073-4b91-a0bc-84a2e6f04a01-kube-api-access-rt5mj" (OuterVolumeSpecName: "kube-api-access-rt5mj") pod "35c98495-0073-4b91-a0bc-84a2e6f04a01" (UID: "35c98495-0073-4b91-a0bc-84a2e6f04a01"). InnerVolumeSpecName "kube-api-access-rt5mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.366168 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-config-data" (OuterVolumeSpecName: "config-data") pod "35c98495-0073-4b91-a0bc-84a2e6f04a01" (UID: "35c98495-0073-4b91-a0bc-84a2e6f04a01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.368348 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35c98495-0073-4b91-a0bc-84a2e6f04a01" (UID: "35c98495-0073-4b91-a0bc-84a2e6f04a01"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.435326 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.435368 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.435384 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt5mj\" (UniqueName: \"kubernetes.io/projected/35c98495-0073-4b91-a0bc-84a2e6f04a01-kube-api-access-rt5mj\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.435396 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35c98495-0073-4b91-a0bc-84a2e6f04a01-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.750181 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2fx6v" event={"ID":"35c98495-0073-4b91-a0bc-84a2e6f04a01","Type":"ContainerDied","Data":"ed150942f6f32ead24a79dc919a842693ad40e54f9b24331a42ca0bddd62a235"} Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.750237 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed150942f6f32ead24a79dc919a842693ad40e54f9b24331a42ca0bddd62a235" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.750265 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2fx6v" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.909091 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 21 22:04:33 crc kubenswrapper[4717]: E0221 22:04:33.909632 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c98495-0073-4b91-a0bc-84a2e6f04a01" containerName="nova-cell0-conductor-db-sync" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.909662 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c98495-0073-4b91-a0bc-84a2e6f04a01" containerName="nova-cell0-conductor-db-sync" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.909963 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c98495-0073-4b91-a0bc-84a2e6f04a01" containerName="nova-cell0-conductor-db-sync" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.910909 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.915050 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.915352 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-46b6j" Feb 21 22:04:33 crc kubenswrapper[4717]: I0221 22:04:33.925070 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 21 22:04:34 crc kubenswrapper[4717]: I0221 22:04:34.046099 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkxzt\" (UniqueName: \"kubernetes.io/projected/1030ba56-81a8-4d0d-8e3d-c17779adcac6-kube-api-access-zkxzt\") pod \"nova-cell0-conductor-0\" (UID: \"1030ba56-81a8-4d0d-8e3d-c17779adcac6\") " pod="openstack/nova-cell0-conductor-0" Feb 21 22:04:34 crc 
kubenswrapper[4717]: I0221 22:04:34.046495 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1030ba56-81a8-4d0d-8e3d-c17779adcac6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1030ba56-81a8-4d0d-8e3d-c17779adcac6\") " pod="openstack/nova-cell0-conductor-0" Feb 21 22:04:34 crc kubenswrapper[4717]: I0221 22:04:34.046644 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1030ba56-81a8-4d0d-8e3d-c17779adcac6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1030ba56-81a8-4d0d-8e3d-c17779adcac6\") " pod="openstack/nova-cell0-conductor-0" Feb 21 22:04:34 crc kubenswrapper[4717]: I0221 22:04:34.147848 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1030ba56-81a8-4d0d-8e3d-c17779adcac6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1030ba56-81a8-4d0d-8e3d-c17779adcac6\") " pod="openstack/nova-cell0-conductor-0" Feb 21 22:04:34 crc kubenswrapper[4717]: I0221 22:04:34.148006 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1030ba56-81a8-4d0d-8e3d-c17779adcac6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1030ba56-81a8-4d0d-8e3d-c17779adcac6\") " pod="openstack/nova-cell0-conductor-0" Feb 21 22:04:34 crc kubenswrapper[4717]: I0221 22:04:34.148095 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkxzt\" (UniqueName: \"kubernetes.io/projected/1030ba56-81a8-4d0d-8e3d-c17779adcac6-kube-api-access-zkxzt\") pod \"nova-cell0-conductor-0\" (UID: \"1030ba56-81a8-4d0d-8e3d-c17779adcac6\") " pod="openstack/nova-cell0-conductor-0" Feb 21 22:04:34 crc kubenswrapper[4717]: I0221 22:04:34.153745 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1030ba56-81a8-4d0d-8e3d-c17779adcac6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1030ba56-81a8-4d0d-8e3d-c17779adcac6\") " pod="openstack/nova-cell0-conductor-0" Feb 21 22:04:34 crc kubenswrapper[4717]: I0221 22:04:34.160667 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1030ba56-81a8-4d0d-8e3d-c17779adcac6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1030ba56-81a8-4d0d-8e3d-c17779adcac6\") " pod="openstack/nova-cell0-conductor-0" Feb 21 22:04:34 crc kubenswrapper[4717]: I0221 22:04:34.167921 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkxzt\" (UniqueName: \"kubernetes.io/projected/1030ba56-81a8-4d0d-8e3d-c17779adcac6-kube-api-access-zkxzt\") pod \"nova-cell0-conductor-0\" (UID: \"1030ba56-81a8-4d0d-8e3d-c17779adcac6\") " pod="openstack/nova-cell0-conductor-0" Feb 21 22:04:34 crc kubenswrapper[4717]: I0221 22:04:34.231068 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 21 22:04:34 crc kubenswrapper[4717]: I0221 22:04:34.717405 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 21 22:04:34 crc kubenswrapper[4717]: W0221 22:04:34.725555 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1030ba56_81a8_4d0d_8e3d_c17779adcac6.slice/crio-c588da56fab6ee76c3b900c53e156f3a18303ac3e70ed6d31594719fcd41be06 WatchSource:0}: Error finding container c588da56fab6ee76c3b900c53e156f3a18303ac3e70ed6d31594719fcd41be06: Status 404 returned error can't find the container with id c588da56fab6ee76c3b900c53e156f3a18303ac3e70ed6d31594719fcd41be06 Feb 21 22:04:34 crc kubenswrapper[4717]: I0221 22:04:34.762023 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64cab48b-28c7-4f30-9e12-b3001c6a5d35","Type":"ContainerStarted","Data":"d45b0655569eb54f03b33d2ed524b2447f3616daf5c141674884248c96e41f3d"} Feb 21 22:04:34 crc kubenswrapper[4717]: I0221 22:04:34.763304 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1030ba56-81a8-4d0d-8e3d-c17779adcac6","Type":"ContainerStarted","Data":"c588da56fab6ee76c3b900c53e156f3a18303ac3e70ed6d31594719fcd41be06"} Feb 21 22:04:35 crc kubenswrapper[4717]: I0221 22:04:35.776927 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64cab48b-28c7-4f30-9e12-b3001c6a5d35","Type":"ContainerStarted","Data":"05a5b8f2fbc304dad253dac2d9baf95e1ff5e923043963e5682dae0812be1e12"} Feb 21 22:04:35 crc kubenswrapper[4717]: I0221 22:04:35.782608 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1030ba56-81a8-4d0d-8e3d-c17779adcac6","Type":"ContainerStarted","Data":"7691161a1e211fb21cf9a1c70948b1d9b8d096f1b53b95652d46869fd13e6d24"} Feb 21 22:04:35 crc 
kubenswrapper[4717]: I0221 22:04:35.783957 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 21 22:04:35 crc kubenswrapper[4717]: I0221 22:04:35.825680 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.82566207 podStartE2EDuration="2.82566207s" podCreationTimestamp="2026-02-21 22:04:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:04:35.817953694 +0000 UTC m=+1090.599487336" watchObservedRunningTime="2026-02-21 22:04:35.82566207 +0000 UTC m=+1090.607195702" Feb 21 22:04:38 crc kubenswrapper[4717]: I0221 22:04:38.812741 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64cab48b-28c7-4f30-9e12-b3001c6a5d35","Type":"ContainerStarted","Data":"58c669d76d700050ee913328cb57d888861f91b56580971ac93a34a27dbf6b17"} Feb 21 22:04:38 crc kubenswrapper[4717]: I0221 22:04:38.813504 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 21 22:04:39 crc kubenswrapper[4717]: I0221 22:04:39.062741 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:04:39 crc kubenswrapper[4717]: I0221 22:04:39.062799 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:04:39 crc kubenswrapper[4717]: I0221 22:04:39.279927 4717 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 21 22:04:39 crc kubenswrapper[4717]: I0221 22:04:39.296299 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.174456346 podStartE2EDuration="9.29628327s" podCreationTimestamp="2026-02-21 22:04:30 +0000 UTC" firstStartedPulling="2026-02-21 22:04:31.601518462 +0000 UTC m=+1086.383052084" lastFinishedPulling="2026-02-21 22:04:37.723345376 +0000 UTC m=+1092.504879008" observedRunningTime="2026-02-21 22:04:38.859530162 +0000 UTC m=+1093.641063824" watchObservedRunningTime="2026-02-21 22:04:39.29628327 +0000 UTC m=+1094.077816892" Feb 21 22:04:39 crc kubenswrapper[4717]: I0221 22:04:39.920892 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-6nknn"] Feb 21 22:04:39 crc kubenswrapper[4717]: I0221 22:04:39.921961 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6nknn" Feb 21 22:04:39 crc kubenswrapper[4717]: I0221 22:04:39.924320 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 21 22:04:39 crc kubenswrapper[4717]: I0221 22:04:39.924454 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 21 22:04:39 crc kubenswrapper[4717]: I0221 22:04:39.940752 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6nknn"] Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.054806 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-config-data\") pod \"nova-cell0-cell-mapping-6nknn\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") " pod="openstack/nova-cell0-cell-mapping-6nknn" Feb 21 22:04:40 crc 
kubenswrapper[4717]: I0221 22:04:40.054906 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-scripts\") pod \"nova-cell0-cell-mapping-6nknn\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") " pod="openstack/nova-cell0-cell-mapping-6nknn" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.054968 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6nknn\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") " pod="openstack/nova-cell0-cell-mapping-6nknn" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.055005 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqmm6\" (UniqueName: \"kubernetes.io/projected/15a97228-6638-44d9-a311-70460be3479e-kube-api-access-sqmm6\") pod \"nova-cell0-cell-mapping-6nknn\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") " pod="openstack/nova-cell0-cell-mapping-6nknn" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.068640 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.072032 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.076920 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.088135 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.147912 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.149475 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.158923 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.159574 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-config-data\") pod \"nova-cell0-cell-mapping-6nknn\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") " pod="openstack/nova-cell0-cell-mapping-6nknn" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.159646 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-scripts\") pod \"nova-cell0-cell-mapping-6nknn\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") " pod="openstack/nova-cell0-cell-mapping-6nknn" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.159667 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9dg7\" (UniqueName: \"kubernetes.io/projected/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-kube-api-access-t9dg7\") pod \"nova-api-0\" (UID: \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " pod="openstack/nova-api-0" Feb 
21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.159692 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " pod="openstack/nova-api-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.159722 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-logs\") pod \"nova-api-0\" (UID: \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " pod="openstack/nova-api-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.159747 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6nknn\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") " pod="openstack/nova-cell0-cell-mapping-6nknn" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.159779 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqmm6\" (UniqueName: \"kubernetes.io/projected/15a97228-6638-44d9-a311-70460be3479e-kube-api-access-sqmm6\") pod \"nova-cell0-cell-mapping-6nknn\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") " pod="openstack/nova-cell0-cell-mapping-6nknn" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.159806 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-config-data\") pod \"nova-api-0\" (UID: \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " pod="openstack/nova-api-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.170396 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-scripts\") pod \"nova-cell0-cell-mapping-6nknn\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") " pod="openstack/nova-cell0-cell-mapping-6nknn" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.170492 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6nknn\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") " pod="openstack/nova-cell0-cell-mapping-6nknn" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.172876 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.174574 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-config-data\") pod \"nova-cell0-cell-mapping-6nknn\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") " pod="openstack/nova-cell0-cell-mapping-6nknn" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.197064 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqmm6\" (UniqueName: \"kubernetes.io/projected/15a97228-6638-44d9-a311-70460be3479e-kube-api-access-sqmm6\") pod \"nova-cell0-cell-mapping-6nknn\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") " pod="openstack/nova-cell0-cell-mapping-6nknn" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.248441 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6nknn" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.268953 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9dg7\" (UniqueName: \"kubernetes.io/projected/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-kube-api-access-t9dg7\") pod \"nova-api-0\" (UID: \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " pod="openstack/nova-api-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.268998 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " pod="openstack/nova-api-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.269031 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-logs\") pod \"nova-api-0\" (UID: \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " pod="openstack/nova-api-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.269066 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzrvb\" (UniqueName: \"kubernetes.io/projected/5e9ee358-f390-45e9-8abb-c0a421fd67d0-kube-api-access-vzrvb\") pod \"nova-metadata-0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") " pod="openstack/nova-metadata-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.269115 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-config-data\") pod \"nova-api-0\" (UID: \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " pod="openstack/nova-api-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.269134 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9ee358-f390-45e9-8abb-c0a421fd67d0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") " pod="openstack/nova-metadata-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.269165 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9ee358-f390-45e9-8abb-c0a421fd67d0-config-data\") pod \"nova-metadata-0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") " pod="openstack/nova-metadata-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.269188 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9ee358-f390-45e9-8abb-c0a421fd67d0-logs\") pod \"nova-metadata-0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") " pod="openstack/nova-metadata-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.270465 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-logs\") pod \"nova-api-0\" (UID: \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " pod="openstack/nova-api-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.273438 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " pod="openstack/nova-api-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.287520 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-config-data\") pod \"nova-api-0\" (UID: 
\"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " pod="openstack/nova-api-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.293952 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-j9l2g"] Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.295391 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.299406 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9dg7\" (UniqueName: \"kubernetes.io/projected/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-kube-api-access-t9dg7\") pod \"nova-api-0\" (UID: \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " pod="openstack/nova-api-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.337021 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-j9l2g"] Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.359930 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.361147 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.373010 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.376169 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.376267 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.376289 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.376323 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzrvb\" (UniqueName: \"kubernetes.io/projected/5e9ee358-f390-45e9-8abb-c0a421fd67d0-kube-api-access-vzrvb\") pod \"nova-metadata-0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") " pod="openstack/nova-metadata-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.376386 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dll2l\" (UniqueName: \"kubernetes.io/projected/b772fc71-6e74-4887-8278-43d737b82e9e-kube-api-access-dll2l\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.376429 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.376456 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9ee358-f390-45e9-8abb-c0a421fd67d0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") " pod="openstack/nova-metadata-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.376476 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-config\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.376499 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9ee358-f390-45e9-8abb-c0a421fd67d0-config-data\") pod \"nova-metadata-0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") " pod="openstack/nova-metadata-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.376527 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5e9ee358-f390-45e9-8abb-c0a421fd67d0-logs\") pod \"nova-metadata-0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") " pod="openstack/nova-metadata-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.377824 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9ee358-f390-45e9-8abb-c0a421fd67d0-logs\") pod \"nova-metadata-0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") " pod="openstack/nova-metadata-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.390086 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9ee358-f390-45e9-8abb-c0a421fd67d0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") " pod="openstack/nova-metadata-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.390952 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.393492 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9ee358-f390-45e9-8abb-c0a421fd67d0-config-data\") pod \"nova-metadata-0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") " pod="openstack/nova-metadata-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.423817 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzrvb\" (UniqueName: \"kubernetes.io/projected/5e9ee358-f390-45e9-8abb-c0a421fd67d0-kube-api-access-vzrvb\") pod \"nova-metadata-0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") " pod="openstack/nova-metadata-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.443976 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.461926 4717 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.463209 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.468873 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.469817 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.478076 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.478112 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2chw\" (UniqueName: \"kubernetes.io/projected/71825248-cfe0-44cf-90d9-ab17c019328d-kube-api-access-p2chw\") pod \"nova-scheduler-0\" (UID: \"71825248-cfe0-44cf-90d9-ab17c019328d\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.478169 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.478187 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.478224 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dll2l\" (UniqueName: \"kubernetes.io/projected/b772fc71-6e74-4887-8278-43d737b82e9e-kube-api-access-dll2l\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.478248 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71825248-cfe0-44cf-90d9-ab17c019328d-config-data\") pod \"nova-scheduler-0\" (UID: \"71825248-cfe0-44cf-90d9-ab17c019328d\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.478272 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.478318 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-config\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.478360 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/71825248-cfe0-44cf-90d9-ab17c019328d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"71825248-cfe0-44cf-90d9-ab17c019328d\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.479019 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.479464 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.479550 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.482702 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-config\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.484381 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: 
\"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.502916 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dll2l\" (UniqueName: \"kubernetes.io/projected/b772fc71-6e74-4887-8278-43d737b82e9e-kube-api-access-dll2l\") pod \"dnsmasq-dns-845d6d6f59-j9l2g\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.512822 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.581124 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.581172 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71825248-cfe0-44cf-90d9-ab17c019328d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"71825248-cfe0-44cf-90d9-ab17c019328d\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.581202 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7k8w\" (UniqueName: \"kubernetes.io/projected/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-kube-api-access-r7k8w\") pod \"nova-cell1-novncproxy-0\" (UID: \"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.581243 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.581277 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2chw\" (UniqueName: \"kubernetes.io/projected/71825248-cfe0-44cf-90d9-ab17c019328d-kube-api-access-p2chw\") pod \"nova-scheduler-0\" (UID: \"71825248-cfe0-44cf-90d9-ab17c019328d\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.581393 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71825248-cfe0-44cf-90d9-ab17c019328d-config-data\") pod \"nova-scheduler-0\" (UID: \"71825248-cfe0-44cf-90d9-ab17c019328d\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.602461 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71825248-cfe0-44cf-90d9-ab17c019328d-config-data\") pod \"nova-scheduler-0\" (UID: \"71825248-cfe0-44cf-90d9-ab17c019328d\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.607932 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2chw\" (UniqueName: \"kubernetes.io/projected/71825248-cfe0-44cf-90d9-ab17c019328d-kube-api-access-p2chw\") pod \"nova-scheduler-0\" (UID: \"71825248-cfe0-44cf-90d9-ab17c019328d\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.622285 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71825248-cfe0-44cf-90d9-ab17c019328d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"71825248-cfe0-44cf-90d9-ab17c019328d\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.683693 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.683754 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7k8w\" (UniqueName: \"kubernetes.io/projected/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-kube-api-access-r7k8w\") pod \"nova-cell1-novncproxy-0\" (UID: \"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.683803 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.688149 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.689597 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:04:40 crc 
kubenswrapper[4717]: I0221 22:04:40.705333 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7k8w\" (UniqueName: \"kubernetes.io/projected/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-kube-api-access-r7k8w\") pod \"nova-cell1-novncproxy-0\" (UID: \"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.786092 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.816588 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.842359 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.958829 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6nknn"] Feb 21 22:04:40 crc kubenswrapper[4717]: W0221 22:04:40.983996 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15a97228_6638_44d9_a311_70460be3479e.slice/crio-6efb14e15915399aef066d69de4d956bd04f8a5540d1f1eb0eeade92e12279ec WatchSource:0}: Error finding container 6efb14e15915399aef066d69de4d956bd04f8a5540d1f1eb0eeade92e12279ec: Status 404 returned error can't find the container with id 6efb14e15915399aef066d69de4d956bd04f8a5540d1f1eb0eeade92e12279ec Feb 21 22:04:40 crc kubenswrapper[4717]: I0221 22:04:40.985027 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.134106 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 21 22:04:41 crc kubenswrapper[4717]: W0221 22:04:41.137042 4717 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e9ee358_f390_45e9_8abb_c0a421fd67d0.slice/crio-b8613e47b6c7a261e37420a43af14b765028fb185c7ca4b1ee507cfadb30399a WatchSource:0}: Error finding container b8613e47b6c7a261e37420a43af14b765028fb185c7ca4b1ee507cfadb30399a: Status 404 returned error can't find the container with id b8613e47b6c7a261e37420a43af14b765028fb185c7ca4b1ee507cfadb30399a Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.221454 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m5gsm"] Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.223499 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m5gsm" Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.231018 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.231234 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.245651 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m5gsm"] Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.306400 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-config-data\") pod \"nova-cell1-conductor-db-sync-m5gsm\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") " pod="openstack/nova-cell1-conductor-db-sync-m5gsm" Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.306585 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-m5gsm\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") " pod="openstack/nova-cell1-conductor-db-sync-m5gsm" Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.306616 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqv77\" (UniqueName: \"kubernetes.io/projected/40c514ae-3636-4054-a900-61fb5fe5c598-kube-api-access-nqv77\") pod \"nova-cell1-conductor-db-sync-m5gsm\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") " pod="openstack/nova-cell1-conductor-db-sync-m5gsm" Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.306745 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-scripts\") pod \"nova-cell1-conductor-db-sync-m5gsm\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") " pod="openstack/nova-cell1-conductor-db-sync-m5gsm" Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.356964 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-j9l2g"] Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.415490 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-m5gsm\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") " pod="openstack/nova-cell1-conductor-db-sync-m5gsm" Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.415534 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqv77\" (UniqueName: \"kubernetes.io/projected/40c514ae-3636-4054-a900-61fb5fe5c598-kube-api-access-nqv77\") pod \"nova-cell1-conductor-db-sync-m5gsm\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") " 
pod="openstack/nova-cell1-conductor-db-sync-m5gsm" Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.415612 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-scripts\") pod \"nova-cell1-conductor-db-sync-m5gsm\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") " pod="openstack/nova-cell1-conductor-db-sync-m5gsm" Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.415669 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-config-data\") pod \"nova-cell1-conductor-db-sync-m5gsm\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") " pod="openstack/nova-cell1-conductor-db-sync-m5gsm" Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.422038 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-m5gsm\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") " pod="openstack/nova-cell1-conductor-db-sync-m5gsm" Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.422078 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-config-data\") pod \"nova-cell1-conductor-db-sync-m5gsm\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") " pod="openstack/nova-cell1-conductor-db-sync-m5gsm" Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.443753 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-scripts\") pod \"nova-cell1-conductor-db-sync-m5gsm\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") " pod="openstack/nova-cell1-conductor-db-sync-m5gsm" Feb 21 
22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.444535 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqv77\" (UniqueName: \"kubernetes.io/projected/40c514ae-3636-4054-a900-61fb5fe5c598-kube-api-access-nqv77\") pod \"nova-cell1-conductor-db-sync-m5gsm\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") " pod="openstack/nova-cell1-conductor-db-sync-m5gsm" Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.497115 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.553839 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 21 22:04:41 crc kubenswrapper[4717]: W0221 22:04:41.554269 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode82c81fb_76a2_4dc1_b082_cafb14ae4dc3.slice/crio-d84fe3f0e3eb0ee8a83ffa4c452348561d3827c3722a9a7a2705ae1520170e7b WatchSource:0}: Error finding container d84fe3f0e3eb0ee8a83ffa4c452348561d3827c3722a9a7a2705ae1520170e7b: Status 404 returned error can't find the container with id d84fe3f0e3eb0ee8a83ffa4c452348561d3827c3722a9a7a2705ae1520170e7b Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.571296 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m5gsm" Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.863121 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"71825248-cfe0-44cf-90d9-ab17c019328d","Type":"ContainerStarted","Data":"d9c9cc5e8d6ba6dd1ae98c19d59f348dadc13e4a30974b1ba25063194eea0fcc"} Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.873491 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3","Type":"ContainerStarted","Data":"d84fe3f0e3eb0ee8a83ffa4c452348561d3827c3722a9a7a2705ae1520170e7b"} Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.875831 4717 generic.go:334] "Generic (PLEG): container finished" podID="b772fc71-6e74-4887-8278-43d737b82e9e" containerID="881fe70bd9ae540109e6c45cd6673ff8e6d21acea9c120bba184d76f3aac40a7" exitCode=0 Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.875937 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" event={"ID":"b772fc71-6e74-4887-8278-43d737b82e9e","Type":"ContainerDied","Data":"881fe70bd9ae540109e6c45cd6673ff8e6d21acea9c120bba184d76f3aac40a7"} Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.875960 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" event={"ID":"b772fc71-6e74-4887-8278-43d737b82e9e","Type":"ContainerStarted","Data":"d761bd190202385fc13b631c19c035127fb3ababebb10aadd2af14f1a7a179f1"} Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.881977 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6nknn" event={"ID":"15a97228-6638-44d9-a311-70460be3479e","Type":"ContainerStarted","Data":"ae81b976f47fcde895a06f4144759381230a0bfd42120e50886b7ce1da6d3bf2"} Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.882009 4717 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell0-cell-mapping-6nknn" event={"ID":"15a97228-6638-44d9-a311-70460be3479e","Type":"ContainerStarted","Data":"6efb14e15915399aef066d69de4d956bd04f8a5540d1f1eb0eeade92e12279ec"} Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.883210 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e","Type":"ContainerStarted","Data":"339b6aa707e66be825d19ab9ee4898166ee4fe5fb1d80a3c1d543d1da37ffa60"} Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.883956 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9ee358-f390-45e9-8abb-c0a421fd67d0","Type":"ContainerStarted","Data":"b8613e47b6c7a261e37420a43af14b765028fb185c7ca4b1ee507cfadb30399a"} Feb 21 22:04:41 crc kubenswrapper[4717]: I0221 22:04:41.943631 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-6nknn" podStartSLOduration=2.943613241 podStartE2EDuration="2.943613241s" podCreationTimestamp="2026-02-21 22:04:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:04:41.911915899 +0000 UTC m=+1096.693449521" watchObservedRunningTime="2026-02-21 22:04:41.943613241 +0000 UTC m=+1096.725146863" Feb 21 22:04:42 crc kubenswrapper[4717]: I0221 22:04:42.111135 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m5gsm"] Feb 21 22:04:42 crc kubenswrapper[4717]: I0221 22:04:42.897268 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-m5gsm" event={"ID":"40c514ae-3636-4054-a900-61fb5fe5c598","Type":"ContainerStarted","Data":"6ee8e5590e0a055de4065fbd758f2f6ea37d66a5c10a9d121772448ea0557df2"} Feb 21 22:04:42 crc kubenswrapper[4717]: I0221 22:04:42.897546 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-m5gsm" event={"ID":"40c514ae-3636-4054-a900-61fb5fe5c598","Type":"ContainerStarted","Data":"30409659ecbc13822f4e87b75a5b30e49d8add72cad1aac67920eee1de6a6f2c"} Feb 21 22:04:42 crc kubenswrapper[4717]: I0221 22:04:42.909020 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" event={"ID":"b772fc71-6e74-4887-8278-43d737b82e9e","Type":"ContainerStarted","Data":"28ef9f785f1ee78a724ecc166a77b5fd7488d03cd2140a5e56a87961a98c89d5"} Feb 21 22:04:42 crc kubenswrapper[4717]: I0221 22:04:42.911024 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:04:42 crc kubenswrapper[4717]: I0221 22:04:42.911204 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-m5gsm" podStartSLOduration=1.911186459 podStartE2EDuration="1.911186459s" podCreationTimestamp="2026-02-21 22:04:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:04:42.910321109 +0000 UTC m=+1097.691854731" watchObservedRunningTime="2026-02-21 22:04:42.911186459 +0000 UTC m=+1097.692720081" Feb 21 22:04:42 crc kubenswrapper[4717]: I0221 22:04:42.938603 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" podStartSLOduration=2.938582249 podStartE2EDuration="2.938582249s" podCreationTimestamp="2026-02-21 22:04:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:04:42.930931585 +0000 UTC m=+1097.712465207" watchObservedRunningTime="2026-02-21 22:04:42.938582249 +0000 UTC m=+1097.720115871" Feb 21 22:04:43 crc kubenswrapper[4717]: I0221 22:04:43.600749 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] 
Feb 21 22:04:43 crc kubenswrapper[4717]: I0221 22:04:43.620694 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 21 22:04:45 crc kubenswrapper[4717]: I0221 22:04:45.954505 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3","Type":"ContainerStarted","Data":"9b5a44b32186bf0ff77932e89566cf94762797a604a2c2cb8f1b7fb6f249a9f0"}
Feb 21 22:04:45 crc kubenswrapper[4717]: I0221 22:04:45.954625 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e82c81fb-76a2-4dc1-b082-cafb14ae4dc3" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9b5a44b32186bf0ff77932e89566cf94762797a604a2c2cb8f1b7fb6f249a9f0" gracePeriod=30
Feb 21 22:04:45 crc kubenswrapper[4717]: I0221 22:04:45.963802 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e","Type":"ContainerStarted","Data":"b896ceeb989112a80b1a6240c22dc312139bb511e14f890f358b087c229d0eec"}
Feb 21 22:04:45 crc kubenswrapper[4717]: I0221 22:04:45.963889 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e","Type":"ContainerStarted","Data":"e60d78b24480d96463df5ca472a07a28120d4cab4e7149f95088660dc4454286"}
Feb 21 22:04:45 crc kubenswrapper[4717]: I0221 22:04:45.967476 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9ee358-f390-45e9-8abb-c0a421fd67d0","Type":"ContainerStarted","Data":"1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5"}
Feb 21 22:04:45 crc kubenswrapper[4717]: I0221 22:04:45.967534 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9ee358-f390-45e9-8abb-c0a421fd67d0","Type":"ContainerStarted","Data":"4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358"}
Feb 21 22:04:45 crc kubenswrapper[4717]: I0221 22:04:45.967659 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5e9ee358-f390-45e9-8abb-c0a421fd67d0" containerName="nova-metadata-log" containerID="cri-o://4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358" gracePeriod=30
Feb 21 22:04:45 crc kubenswrapper[4717]: I0221 22:04:45.968086 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5e9ee358-f390-45e9-8abb-c0a421fd67d0" containerName="nova-metadata-metadata" containerID="cri-o://1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5" gracePeriod=30
Feb 21 22:04:45 crc kubenswrapper[4717]: I0221 22:04:45.970300 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"71825248-cfe0-44cf-90d9-ab17c019328d","Type":"ContainerStarted","Data":"d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240"}
Feb 21 22:04:45 crc kubenswrapper[4717]: I0221 22:04:45.979929 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.810013276 podStartE2EDuration="5.97991097s" podCreationTimestamp="2026-02-21 22:04:40 +0000 UTC" firstStartedPulling="2026-02-21 22:04:41.562315728 +0000 UTC m=+1096.343849350" lastFinishedPulling="2026-02-21 22:04:44.732213412 +0000 UTC m=+1099.513747044" observedRunningTime="2026-02-21 22:04:45.978566217 +0000 UTC m=+1100.760099849" watchObservedRunningTime="2026-02-21 22:04:45.97991097 +0000 UTC m=+1100.761444592"
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.009497 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.786400778 podStartE2EDuration="6.009478041s" podCreationTimestamp="2026-02-21 22:04:40 +0000 UTC" firstStartedPulling="2026-02-21 22:04:41.506757941 +0000 UTC m=+1096.288291563" lastFinishedPulling="2026-02-21 22:04:44.729835204 +0000 UTC m=+1099.511368826" observedRunningTime="2026-02-21 22:04:46.005414043 +0000 UTC m=+1100.786947685" watchObservedRunningTime="2026-02-21 22:04:46.009478041 +0000 UTC m=+1100.791011663"
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.032622 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.458058499 podStartE2EDuration="6.032601678s" podCreationTimestamp="2026-02-21 22:04:40 +0000 UTC" firstStartedPulling="2026-02-21 22:04:41.155379367 +0000 UTC m=+1095.936912989" lastFinishedPulling="2026-02-21 22:04:44.729922536 +0000 UTC m=+1099.511456168" observedRunningTime="2026-02-21 22:04:46.022039973 +0000 UTC m=+1100.803573595" watchObservedRunningTime="2026-02-21 22:04:46.032601678 +0000 UTC m=+1100.814135300"
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.046778 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.339043105 podStartE2EDuration="6.046760338s" podCreationTimestamp="2026-02-21 22:04:40 +0000 UTC" firstStartedPulling="2026-02-21 22:04:41.018042873 +0000 UTC m=+1095.799576495" lastFinishedPulling="2026-02-21 22:04:44.725760096 +0000 UTC m=+1099.507293728" observedRunningTime="2026-02-21 22:04:46.040426645 +0000 UTC m=+1100.821960267" watchObservedRunningTime="2026-02-21 22:04:46.046760338 +0000 UTC m=+1100.828293960"
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.711488 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.847681 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9ee358-f390-45e9-8abb-c0a421fd67d0-config-data\") pod \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") "
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.847834 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9ee358-f390-45e9-8abb-c0a421fd67d0-combined-ca-bundle\") pod \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") "
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.847885 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzrvb\" (UniqueName: \"kubernetes.io/projected/5e9ee358-f390-45e9-8abb-c0a421fd67d0-kube-api-access-vzrvb\") pod \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") "
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.848551 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e9ee358-f390-45e9-8abb-c0a421fd67d0-logs" (OuterVolumeSpecName: "logs") pod "5e9ee358-f390-45e9-8abb-c0a421fd67d0" (UID: "5e9ee358-f390-45e9-8abb-c0a421fd67d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.848034 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9ee358-f390-45e9-8abb-c0a421fd67d0-logs\") pod \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\" (UID: \"5e9ee358-f390-45e9-8abb-c0a421fd67d0\") "
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.849840 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9ee358-f390-45e9-8abb-c0a421fd67d0-logs\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.858287 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9ee358-f390-45e9-8abb-c0a421fd67d0-kube-api-access-vzrvb" (OuterVolumeSpecName: "kube-api-access-vzrvb") pod "5e9ee358-f390-45e9-8abb-c0a421fd67d0" (UID: "5e9ee358-f390-45e9-8abb-c0a421fd67d0"). InnerVolumeSpecName "kube-api-access-vzrvb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.876636 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9ee358-f390-45e9-8abb-c0a421fd67d0-config-data" (OuterVolumeSpecName: "config-data") pod "5e9ee358-f390-45e9-8abb-c0a421fd67d0" (UID: "5e9ee358-f390-45e9-8abb-c0a421fd67d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.879549 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9ee358-f390-45e9-8abb-c0a421fd67d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e9ee358-f390-45e9-8abb-c0a421fd67d0" (UID: "5e9ee358-f390-45e9-8abb-c0a421fd67d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.951975 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9ee358-f390-45e9-8abb-c0a421fd67d0-config-data\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.952015 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9ee358-f390-45e9-8abb-c0a421fd67d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.952030 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzrvb\" (UniqueName: \"kubernetes.io/projected/5e9ee358-f390-45e9-8abb-c0a421fd67d0-kube-api-access-vzrvb\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.982507 4717 generic.go:334] "Generic (PLEG): container finished" podID="5e9ee358-f390-45e9-8abb-c0a421fd67d0" containerID="1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5" exitCode=0
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.982536 4717 generic.go:334] "Generic (PLEG): container finished" podID="5e9ee358-f390-45e9-8abb-c0a421fd67d0" containerID="4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358" exitCode=143
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.983398 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.984487 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9ee358-f390-45e9-8abb-c0a421fd67d0","Type":"ContainerDied","Data":"1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5"}
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.984534 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9ee358-f390-45e9-8abb-c0a421fd67d0","Type":"ContainerDied","Data":"4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358"}
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.984547 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9ee358-f390-45e9-8abb-c0a421fd67d0","Type":"ContainerDied","Data":"b8613e47b6c7a261e37420a43af14b765028fb185c7ca4b1ee507cfadb30399a"}
Feb 21 22:04:46 crc kubenswrapper[4717]: I0221 22:04:46.984564 4717 scope.go:117] "RemoveContainer" containerID="1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.005675 4717 scope.go:117] "RemoveContainer" containerID="4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.024250 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.039319 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.051487 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 21 22:04:47 crc kubenswrapper[4717]: E0221 22:04:47.052065 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9ee358-f390-45e9-8abb-c0a421fd67d0" containerName="nova-metadata-metadata"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.052082 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9ee358-f390-45e9-8abb-c0a421fd67d0" containerName="nova-metadata-metadata"
Feb 21 22:04:47 crc kubenswrapper[4717]: E0221 22:04:47.052134 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9ee358-f390-45e9-8abb-c0a421fd67d0" containerName="nova-metadata-log"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.052144 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9ee358-f390-45e9-8abb-c0a421fd67d0" containerName="nova-metadata-log"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.052387 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9ee358-f390-45e9-8abb-c0a421fd67d0" containerName="nova-metadata-metadata"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.052405 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9ee358-f390-45e9-8abb-c0a421fd67d0" containerName="nova-metadata-log"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.053416 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.055805 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.058573 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.062267 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.069343 4717 scope.go:117] "RemoveContainer" containerID="1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5"
Feb 21 22:04:47 crc kubenswrapper[4717]: E0221 22:04:47.069910 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5\": container with ID starting with 1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5 not found: ID does not exist" containerID="1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.069969 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5"} err="failed to get container status \"1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5\": rpc error: code = NotFound desc = could not find container \"1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5\": container with ID starting with 1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5 not found: ID does not exist"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.070007 4717 scope.go:117] "RemoveContainer" containerID="4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358"
Feb 21 22:04:47 crc kubenswrapper[4717]: E0221 22:04:47.070503 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358\": container with ID starting with 4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358 not found: ID does not exist" containerID="4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.070542 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358"} err="failed to get container status \"4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358\": rpc error: code = NotFound desc = could not find container \"4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358\": container with ID starting with 4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358 not found: ID does not exist"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.070574 4717 scope.go:117] "RemoveContainer" containerID="1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.070831 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5"} err="failed to get container status \"1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5\": rpc error: code = NotFound desc = could not find container \"1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5\": container with ID starting with 1c630a49ac77c818772af888a18317b279eda27f072b3e38585f15a296a887e5 not found: ID does not exist"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.071000 4717 scope.go:117] "RemoveContainer" containerID="4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.071729 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358"} err="failed to get container status \"4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358\": rpc error: code = NotFound desc = could not find container \"4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358\": container with ID starting with 4c373aea2880764376266b9ca3087b86b4f350b66624c926f327a13e693bb358 not found: ID does not exist"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.156900 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-config-data\") pod \"nova-metadata-0\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.157090 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6k42\" (UniqueName: \"kubernetes.io/projected/c717ce7b-5672-43dd-8cb9-fe1e97adf751-kube-api-access-s6k42\") pod \"nova-metadata-0\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.157146 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c717ce7b-5672-43dd-8cb9-fe1e97adf751-logs\") pod \"nova-metadata-0\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.157306 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.157333 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.259236 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-config-data\") pod \"nova-metadata-0\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.259333 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6k42\" (UniqueName: \"kubernetes.io/projected/c717ce7b-5672-43dd-8cb9-fe1e97adf751-kube-api-access-s6k42\") pod \"nova-metadata-0\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.259363 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c717ce7b-5672-43dd-8cb9-fe1e97adf751-logs\") pod \"nova-metadata-0\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.259419 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.259444 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.263370 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c717ce7b-5672-43dd-8cb9-fe1e97adf751-logs\") pod \"nova-metadata-0\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.269892 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.270048 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-config-data\") pod \"nova-metadata-0\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.273991 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.293880 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6k42\" (UniqueName: \"kubernetes.io/projected/c717ce7b-5672-43dd-8cb9-fe1e97adf751-kube-api-access-s6k42\") pod \"nova-metadata-0\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.375563 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.817354 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 21 22:04:47 crc kubenswrapper[4717]: W0221 22:04:47.821396 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc717ce7b_5672_43dd_8cb9_fe1e97adf751.slice/crio-50e124d47d1944047ee45b3325c1a8c1ce16916d8301c2f38dc321e89580e41c WatchSource:0}: Error finding container 50e124d47d1944047ee45b3325c1a8c1ce16916d8301c2f38dc321e89580e41c: Status 404 returned error can't find the container with id 50e124d47d1944047ee45b3325c1a8c1ce16916d8301c2f38dc321e89580e41c
Feb 21 22:04:47 crc kubenswrapper[4717]: I0221 22:04:47.993721 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9ee358-f390-45e9-8abb-c0a421fd67d0" path="/var/lib/kubelet/pods/5e9ee358-f390-45e9-8abb-c0a421fd67d0/volumes"
Feb 21 22:04:48 crc kubenswrapper[4717]: I0221 22:04:48.000469 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c717ce7b-5672-43dd-8cb9-fe1e97adf751","Type":"ContainerStarted","Data":"50e124d47d1944047ee45b3325c1a8c1ce16916d8301c2f38dc321e89580e41c"}
Feb 21 22:04:49 crc kubenswrapper[4717]: I0221 22:04:49.037383 4717 generic.go:334] "Generic (PLEG): container finished" podID="15a97228-6638-44d9-a311-70460be3479e" containerID="ae81b976f47fcde895a06f4144759381230a0bfd42120e50886b7ce1da6d3bf2" exitCode=0
Feb 21 22:04:49 crc kubenswrapper[4717]: I0221 22:04:49.037451 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6nknn" event={"ID":"15a97228-6638-44d9-a311-70460be3479e","Type":"ContainerDied","Data":"ae81b976f47fcde895a06f4144759381230a0bfd42120e50886b7ce1da6d3bf2"}
Feb 21 22:04:49 crc kubenswrapper[4717]: I0221 22:04:49.041411 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c717ce7b-5672-43dd-8cb9-fe1e97adf751","Type":"ContainerStarted","Data":"5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c"}
Feb 21 22:04:49 crc kubenswrapper[4717]: I0221 22:04:49.041447 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c717ce7b-5672-43dd-8cb9-fe1e97adf751","Type":"ContainerStarted","Data":"ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b"}
Feb 21 22:04:49 crc kubenswrapper[4717]: I0221 22:04:49.092456 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.092428833 podStartE2EDuration="2.092428833s" podCreationTimestamp="2026-02-21 22:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:04:49.088894339 +0000 UTC m=+1103.870427991" watchObservedRunningTime="2026-02-21 22:04:49.092428833 +0000 UTC m=+1103.873962475"
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.059074 4717 generic.go:334] "Generic (PLEG): container finished" podID="40c514ae-3636-4054-a900-61fb5fe5c598" containerID="6ee8e5590e0a055de4065fbd758f2f6ea37d66a5c10a9d121772448ea0557df2" exitCode=0
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.059203 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-m5gsm" event={"ID":"40c514ae-3636-4054-a900-61fb5fe5c598","Type":"ContainerDied","Data":"6ee8e5590e0a055de4065fbd758f2f6ea37d66a5c10a9d121772448ea0557df2"}
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.392194 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.392273 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.477576 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6nknn"
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.658256 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqmm6\" (UniqueName: \"kubernetes.io/projected/15a97228-6638-44d9-a311-70460be3479e-kube-api-access-sqmm6\") pod \"15a97228-6638-44d9-a311-70460be3479e\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") "
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.658487 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-scripts\") pod \"15a97228-6638-44d9-a311-70460be3479e\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") "
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.658636 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-combined-ca-bundle\") pod \"15a97228-6638-44d9-a311-70460be3479e\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") "
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.658695 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-config-data\") pod \"15a97228-6638-44d9-a311-70460be3479e\" (UID: \"15a97228-6638-44d9-a311-70460be3479e\") "
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.675068 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-scripts" (OuterVolumeSpecName: "scripts") pod "15a97228-6638-44d9-a311-70460be3479e" (UID: "15a97228-6638-44d9-a311-70460be3479e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.680074 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a97228-6638-44d9-a311-70460be3479e-kube-api-access-sqmm6" (OuterVolumeSpecName: "kube-api-access-sqmm6") pod "15a97228-6638-44d9-a311-70460be3479e" (UID: "15a97228-6638-44d9-a311-70460be3479e"). InnerVolumeSpecName "kube-api-access-sqmm6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.697962 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-config-data" (OuterVolumeSpecName: "config-data") pod "15a97228-6638-44d9-a311-70460be3479e" (UID: "15a97228-6638-44d9-a311-70460be3479e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.712749 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15a97228-6638-44d9-a311-70460be3479e" (UID: "15a97228-6638-44d9-a311-70460be3479e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.761623 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.761670 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-config-data\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.761690 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqmm6\" (UniqueName: \"kubernetes.io/projected/15a97228-6638-44d9-a311-70460be3479e-kube-api-access-sqmm6\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.761710 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a97228-6638-44d9-a311-70460be3479e-scripts\") on node \"crc\" DevicePath \"\""
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.788648 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g"
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.820072 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.821057 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.844069 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.872784 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-gsz4r"]
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.873024 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" podUID="43f65bb5-3e40-46d5-81ef-433c345544ac" containerName="dnsmasq-dns" containerID="cri-o://d955b1b3aafbd0d0b01af76c2163bd00378a9a0ba0fb36bf775af1818075a4cb" gracePeriod=10
Feb 21 22:04:50 crc kubenswrapper[4717]: I0221 22:04:50.883501 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.081497 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6nknn" event={"ID":"15a97228-6638-44d9-a311-70460be3479e","Type":"ContainerDied","Data":"6efb14e15915399aef066d69de4d956bd04f8a5540d1f1eb0eeade92e12279ec"}
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.081557 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6efb14e15915399aef066d69de4d956bd04f8a5540d1f1eb0eeade92e12279ec"
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.081677 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6nknn"
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.087367 4717 generic.go:334] "Generic (PLEG): container finished" podID="43f65bb5-3e40-46d5-81ef-433c345544ac" containerID="d955b1b3aafbd0d0b01af76c2163bd00378a9a0ba0fb36bf775af1818075a4cb" exitCode=0
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.087646 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" event={"ID":"43f65bb5-3e40-46d5-81ef-433c345544ac","Type":"ContainerDied","Data":"d955b1b3aafbd0d0b01af76c2163bd00378a9a0ba0fb36bf775af1818075a4cb"}
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.134851 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.348713 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.349590 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" containerName="nova-api-api" containerID="cri-o://b896ceeb989112a80b1a6240c22dc312139bb511e14f890f358b087c229d0eec" gracePeriod=30
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.349266 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" containerName="nova-api-log" containerID="cri-o://e60d78b24480d96463df5ca472a07a28120d4cab4e7149f95088660dc4454286" gracePeriod=30
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.359579 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": EOF"
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.359786 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": EOF"
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.394779 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.394997 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c717ce7b-5672-43dd-8cb9-fe1e97adf751" containerName="nova-metadata-log" containerID="cri-o://ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b" gracePeriod=30
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.395335 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c717ce7b-5672-43dd-8cb9-fe1e97adf751" containerName="nova-metadata-metadata" containerID="cri-o://5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c" gracePeriod=30
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.673549 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-gsz4r"
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.677728 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m5gsm"
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.788702 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmcz8\" (UniqueName: \"kubernetes.io/projected/43f65bb5-3e40-46d5-81ef-433c345544ac-kube-api-access-tmcz8\") pod \"43f65bb5-3e40-46d5-81ef-433c345544ac\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") "
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.788801 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-ovsdbserver-sb\") pod \"43f65bb5-3e40-46d5-81ef-433c345544ac\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") "
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.788838 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqv77\" (UniqueName: \"kubernetes.io/projected/40c514ae-3636-4054-a900-61fb5fe5c598-kube-api-access-nqv77\") pod \"40c514ae-3636-4054-a900-61fb5fe5c598\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") "
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.788911 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-ovsdbserver-nb\") pod \"43f65bb5-3e40-46d5-81ef-433c345544ac\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") "
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.788955 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-config\") pod \"43f65bb5-3e40-46d5-81ef-433c345544ac\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") "
Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.791307 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-dns-swift-storage-0\") pod \"43f65bb5-3e40-46d5-81ef-433c345544ac\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.791827 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-dns-svc\") pod \"43f65bb5-3e40-46d5-81ef-433c345544ac\" (UID: \"43f65bb5-3e40-46d5-81ef-433c345544ac\") " Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.792211 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-scripts\") pod \"40c514ae-3636-4054-a900-61fb5fe5c598\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") " Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.792261 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-combined-ca-bundle\") pod \"40c514ae-3636-4054-a900-61fb5fe5c598\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") " Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.792292 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-config-data\") pod \"40c514ae-3636-4054-a900-61fb5fe5c598\" (UID: \"40c514ae-3636-4054-a900-61fb5fe5c598\") " Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.797125 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43f65bb5-3e40-46d5-81ef-433c345544ac-kube-api-access-tmcz8" (OuterVolumeSpecName: "kube-api-access-tmcz8") pod "43f65bb5-3e40-46d5-81ef-433c345544ac" (UID: "43f65bb5-3e40-46d5-81ef-433c345544ac"). 
InnerVolumeSpecName "kube-api-access-tmcz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.797210 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c514ae-3636-4054-a900-61fb5fe5c598-kube-api-access-nqv77" (OuterVolumeSpecName: "kube-api-access-nqv77") pod "40c514ae-3636-4054-a900-61fb5fe5c598" (UID: "40c514ae-3636-4054-a900-61fb5fe5c598"). InnerVolumeSpecName "kube-api-access-nqv77". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.802952 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-scripts" (OuterVolumeSpecName: "scripts") pod "40c514ae-3636-4054-a900-61fb5fe5c598" (UID: "40c514ae-3636-4054-a900-61fb5fe5c598"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.846736 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.859290 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40c514ae-3636-4054-a900-61fb5fe5c598" (UID: "40c514ae-3636-4054-a900-61fb5fe5c598"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.874780 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-config-data" (OuterVolumeSpecName: "config-data") pod "40c514ae-3636-4054-a900-61fb5fe5c598" (UID: "40c514ae-3636-4054-a900-61fb5fe5c598"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.881765 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "43f65bb5-3e40-46d5-81ef-433c345544ac" (UID: "43f65bb5-3e40-46d5-81ef-433c345544ac"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.884600 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "43f65bb5-3e40-46d5-81ef-433c345544ac" (UID: "43f65bb5-3e40-46d5-81ef-433c345544ac"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.895065 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.895096 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.895109 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40c514ae-3636-4054-a900-61fb5fe5c598-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.895121 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmcz8\" (UniqueName: \"kubernetes.io/projected/43f65bb5-3e40-46d5-81ef-433c345544ac-kube-api-access-tmcz8\") on node \"crc\" 
DevicePath \"\"" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.895133 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.895143 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqv77\" (UniqueName: \"kubernetes.io/projected/40c514ae-3636-4054-a900-61fb5fe5c598-kube-api-access-nqv77\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.895153 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.900272 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "43f65bb5-3e40-46d5-81ef-433c345544ac" (UID: "43f65bb5-3e40-46d5-81ef-433c345544ac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.906471 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "43f65bb5-3e40-46d5-81ef-433c345544ac" (UID: "43f65bb5-3e40-46d5-81ef-433c345544ac"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.907182 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-config" (OuterVolumeSpecName: "config") pod "43f65bb5-3e40-46d5-81ef-433c345544ac" (UID: "43f65bb5-3e40-46d5-81ef-433c345544ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.988746 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.997116 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.997162 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:51 crc kubenswrapper[4717]: I0221 22:04:51.997172 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43f65bb5-3e40-46d5-81ef-433c345544ac-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.097338 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.097843 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-nova-metadata-tls-certs\") pod \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.097994 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6k42\" (UniqueName: \"kubernetes.io/projected/c717ce7b-5672-43dd-8cb9-fe1e97adf751-kube-api-access-s6k42\") pod \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.098056 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c717ce7b-5672-43dd-8cb9-fe1e97adf751-logs\") pod \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.098178 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-gsz4r" event={"ID":"43f65bb5-3e40-46d5-81ef-433c345544ac","Type":"ContainerDied","Data":"99a14157e5a62a7cb0b6c5a97e94f36ee6453d662f89287772fa3c2161b3cdbd"} Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.098211 4717 scope.go:117] "RemoveContainer" containerID="d955b1b3aafbd0d0b01af76c2163bd00378a9a0ba0fb36bf775af1818075a4cb" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.098251 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-combined-ca-bundle\") pod \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " Feb 21 
22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.098296 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-config-data\") pod \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\" (UID: \"c717ce7b-5672-43dd-8cb9-fe1e97adf751\") " Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.099310 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c717ce7b-5672-43dd-8cb9-fe1e97adf751-logs" (OuterVolumeSpecName: "logs") pod "c717ce7b-5672-43dd-8cb9-fe1e97adf751" (UID: "c717ce7b-5672-43dd-8cb9-fe1e97adf751"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.102818 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c717ce7b-5672-43dd-8cb9-fe1e97adf751-kube-api-access-s6k42" (OuterVolumeSpecName: "kube-api-access-s6k42") pod "c717ce7b-5672-43dd-8cb9-fe1e97adf751" (UID: "c717ce7b-5672-43dd-8cb9-fe1e97adf751"). InnerVolumeSpecName "kube-api-access-s6k42". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.105780 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-m5gsm" event={"ID":"40c514ae-3636-4054-a900-61fb5fe5c598","Type":"ContainerDied","Data":"30409659ecbc13822f4e87b75a5b30e49d8add72cad1aac67920eee1de6a6f2c"} Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.105828 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30409659ecbc13822f4e87b75a5b30e49d8add72cad1aac67920eee1de6a6f2c" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.105950 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m5gsm" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.109437 4717 generic.go:334] "Generic (PLEG): container finished" podID="810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" containerID="e60d78b24480d96463df5ca472a07a28120d4cab4e7149f95088660dc4454286" exitCode=143 Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.109508 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e","Type":"ContainerDied","Data":"e60d78b24480d96463df5ca472a07a28120d4cab4e7149f95088660dc4454286"} Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.120710 4717 generic.go:334] "Generic (PLEG): container finished" podID="c717ce7b-5672-43dd-8cb9-fe1e97adf751" containerID="5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c" exitCode=0 Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.120758 4717 generic.go:334] "Generic (PLEG): container finished" podID="c717ce7b-5672-43dd-8cb9-fe1e97adf751" containerID="ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b" exitCode=143 Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.123268 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.123497 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c717ce7b-5672-43dd-8cb9-fe1e97adf751","Type":"ContainerDied","Data":"5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c"} Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.123532 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c717ce7b-5672-43dd-8cb9-fe1e97adf751","Type":"ContainerDied","Data":"ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b"} Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.123547 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c717ce7b-5672-43dd-8cb9-fe1e97adf751","Type":"ContainerDied","Data":"50e124d47d1944047ee45b3325c1a8c1ce16916d8301c2f38dc321e89580e41c"} Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.145240 4717 scope.go:117] "RemoveContainer" containerID="f00fb47b8119e27f62a5f7ddb63182aa1b6cd697608ec90c6df25f93bd8cbbfd" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.147522 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-gsz4r"] Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.156118 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c717ce7b-5672-43dd-8cb9-fe1e97adf751" (UID: "c717ce7b-5672-43dd-8cb9-fe1e97adf751"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.168783 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-gsz4r"] Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.174662 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 21 22:04:52 crc kubenswrapper[4717]: E0221 22:04:52.175056 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c717ce7b-5672-43dd-8cb9-fe1e97adf751" containerName="nova-metadata-log" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.175069 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c717ce7b-5672-43dd-8cb9-fe1e97adf751" containerName="nova-metadata-log" Feb 21 22:04:52 crc kubenswrapper[4717]: E0221 22:04:52.175080 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c717ce7b-5672-43dd-8cb9-fe1e97adf751" containerName="nova-metadata-metadata" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.175086 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c717ce7b-5672-43dd-8cb9-fe1e97adf751" containerName="nova-metadata-metadata" Feb 21 22:04:52 crc kubenswrapper[4717]: E0221 22:04:52.175116 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f65bb5-3e40-46d5-81ef-433c345544ac" containerName="dnsmasq-dns" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.175123 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f65bb5-3e40-46d5-81ef-433c345544ac" containerName="dnsmasq-dns" Feb 21 22:04:52 crc kubenswrapper[4717]: E0221 22:04:52.175138 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f65bb5-3e40-46d5-81ef-433c345544ac" containerName="init" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.175143 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f65bb5-3e40-46d5-81ef-433c345544ac" containerName="init" Feb 21 22:04:52 crc kubenswrapper[4717]: E0221 
22:04:52.175151 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a97228-6638-44d9-a311-70460be3479e" containerName="nova-manage" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.175158 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a97228-6638-44d9-a311-70460be3479e" containerName="nova-manage" Feb 21 22:04:52 crc kubenswrapper[4717]: E0221 22:04:52.175170 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c514ae-3636-4054-a900-61fb5fe5c598" containerName="nova-cell1-conductor-db-sync" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.175179 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c514ae-3636-4054-a900-61fb5fe5c598" containerName="nova-cell1-conductor-db-sync" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.175427 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a97228-6638-44d9-a311-70460be3479e" containerName="nova-manage" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.175442 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c717ce7b-5672-43dd-8cb9-fe1e97adf751" containerName="nova-metadata-log" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.175454 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f65bb5-3e40-46d5-81ef-433c345544ac" containerName="dnsmasq-dns" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.175465 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c514ae-3636-4054-a900-61fb5fe5c598" containerName="nova-cell1-conductor-db-sync" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.175474 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c717ce7b-5672-43dd-8cb9-fe1e97adf751" containerName="nova-metadata-metadata" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.176195 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.178832 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.182138 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.184105 4717 scope.go:117] "RemoveContainer" containerID="5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.185611 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-config-data" (OuterVolumeSpecName: "config-data") pod "c717ce7b-5672-43dd-8cb9-fe1e97adf751" (UID: "c717ce7b-5672-43dd-8cb9-fe1e97adf751"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.202810 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.202833 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.202842 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6k42\" (UniqueName: \"kubernetes.io/projected/c717ce7b-5672-43dd-8cb9-fe1e97adf751-kube-api-access-s6k42\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.202851 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c717ce7b-5672-43dd-8cb9-fe1e97adf751-logs\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.212017 4717 scope.go:117] "RemoveContainer" containerID="ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.231181 4717 scope.go:117] "RemoveContainer" containerID="5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c" Feb 21 22:04:52 crc kubenswrapper[4717]: E0221 22:04:52.231529 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c\": container with ID starting with 5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c not found: ID does not exist" containerID="5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.231556 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c"} err="failed to get container status \"5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c\": rpc error: code = NotFound desc = could not find container \"5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c\": container with ID starting with 5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c not found: ID does not exist" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.231578 4717 scope.go:117] "RemoveContainer" containerID="ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b" Feb 21 22:04:52 crc kubenswrapper[4717]: E0221 22:04:52.231914 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b\": container with ID starting with 
ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b not found: ID does not exist" containerID="ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.231939 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b"} err="failed to get container status \"ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b\": rpc error: code = NotFound desc = could not find container \"ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b\": container with ID starting with ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b not found: ID does not exist" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.231952 4717 scope.go:117] "RemoveContainer" containerID="5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.232320 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c"} err="failed to get container status \"5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c\": rpc error: code = NotFound desc = could not find container \"5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c\": container with ID starting with 5c6fb57b20689bd6b69caaec1bc99dbf5391a9c334525b353198d254145f936c not found: ID does not exist" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.232338 4717 scope.go:117] "RemoveContainer" containerID="ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.232636 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b"} err="failed to get container status 
\"ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b\": rpc error: code = NotFound desc = could not find container \"ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b\": container with ID starting with ecce4b51e4d052257d53d0f2f9c04220884767f831a965225c310d829b9e8e1b not found: ID does not exist" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.243015 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c717ce7b-5672-43dd-8cb9-fe1e97adf751" (UID: "c717ce7b-5672-43dd-8cb9-fe1e97adf751"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.304935 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e40eb10-e007-4fc0-97ff-de455effb430-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5e40eb10-e007-4fc0-97ff-de455effb430\") " pod="openstack/nova-cell1-conductor-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.305527 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-947jq\" (UniqueName: \"kubernetes.io/projected/5e40eb10-e007-4fc0-97ff-de455effb430-kube-api-access-947jq\") pod \"nova-cell1-conductor-0\" (UID: \"5e40eb10-e007-4fc0-97ff-de455effb430\") " pod="openstack/nova-cell1-conductor-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.305800 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e40eb10-e007-4fc0-97ff-de455effb430-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5e40eb10-e007-4fc0-97ff-de455effb430\") " pod="openstack/nova-cell1-conductor-0" Feb 21 
22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.306179 4717 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c717ce7b-5672-43dd-8cb9-fe1e97adf751-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.407655 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e40eb10-e007-4fc0-97ff-de455effb430-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5e40eb10-e007-4fc0-97ff-de455effb430\") " pod="openstack/nova-cell1-conductor-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.407733 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-947jq\" (UniqueName: \"kubernetes.io/projected/5e40eb10-e007-4fc0-97ff-de455effb430-kube-api-access-947jq\") pod \"nova-cell1-conductor-0\" (UID: \"5e40eb10-e007-4fc0-97ff-de455effb430\") " pod="openstack/nova-cell1-conductor-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.407805 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e40eb10-e007-4fc0-97ff-de455effb430-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5e40eb10-e007-4fc0-97ff-de455effb430\") " pod="openstack/nova-cell1-conductor-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.412159 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e40eb10-e007-4fc0-97ff-de455effb430-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5e40eb10-e007-4fc0-97ff-de455effb430\") " pod="openstack/nova-cell1-conductor-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.412591 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5e40eb10-e007-4fc0-97ff-de455effb430-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5e40eb10-e007-4fc0-97ff-de455effb430\") " pod="openstack/nova-cell1-conductor-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.423949 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-947jq\" (UniqueName: \"kubernetes.io/projected/5e40eb10-e007-4fc0-97ff-de455effb430-kube-api-access-947jq\") pod \"nova-cell1-conductor-0\" (UID: \"5e40eb10-e007-4fc0-97ff-de455effb430\") " pod="openstack/nova-cell1-conductor-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.511367 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.649957 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.673827 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.683735 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.688010 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.690169 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.705136 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.721252 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.815075 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-config-data\") pod \"nova-metadata-0\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.815259 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b088db80-9bd4-4573-9762-45c2be00a3e3-logs\") pod \"nova-metadata-0\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.815327 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.815378 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"b088db80-9bd4-4573-9762-45c2be00a3e3\") " pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.815696 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9xh6\" (UniqueName: \"kubernetes.io/projected/b088db80-9bd4-4573-9762-45c2be00a3e3-kube-api-access-s9xh6\") pod \"nova-metadata-0\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.917398 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.917563 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9xh6\" (UniqueName: \"kubernetes.io/projected/b088db80-9bd4-4573-9762-45c2be00a3e3-kube-api-access-s9xh6\") pod \"nova-metadata-0\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.917610 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-config-data\") pod \"nova-metadata-0\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.917664 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b088db80-9bd4-4573-9762-45c2be00a3e3-logs\") pod \"nova-metadata-0\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.917698 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.918208 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b088db80-9bd4-4573-9762-45c2be00a3e3-logs\") pod \"nova-metadata-0\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.921648 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.921892 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-config-data\") pod \"nova-metadata-0\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.922088 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " pod="openstack/nova-metadata-0" Feb 21 22:04:52 crc kubenswrapper[4717]: I0221 22:04:52.945732 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9xh6\" (UniqueName: \"kubernetes.io/projected/b088db80-9bd4-4573-9762-45c2be00a3e3-kube-api-access-s9xh6\") pod \"nova-metadata-0\" 
(UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " pod="openstack/nova-metadata-0" Feb 21 22:04:53 crc kubenswrapper[4717]: I0221 22:04:53.013048 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 21 22:04:53 crc kubenswrapper[4717]: W0221 22:04:53.058880 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e40eb10_e007_4fc0_97ff_de455effb430.slice/crio-cdf9ae169416f56ed407dc0c64f42f8ebb6a8fde6027338e5e2a9240601c2d6f WatchSource:0}: Error finding container cdf9ae169416f56ed407dc0c64f42f8ebb6a8fde6027338e5e2a9240601c2d6f: Status 404 returned error can't find the container with id cdf9ae169416f56ed407dc0c64f42f8ebb6a8fde6027338e5e2a9240601c2d6f Feb 21 22:04:53 crc kubenswrapper[4717]: I0221 22:04:53.066130 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 21 22:04:53 crc kubenswrapper[4717]: I0221 22:04:53.130552 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5e40eb10-e007-4fc0-97ff-de455effb430","Type":"ContainerStarted","Data":"cdf9ae169416f56ed407dc0c64f42f8ebb6a8fde6027338e5e2a9240601c2d6f"} Feb 21 22:04:53 crc kubenswrapper[4717]: I0221 22:04:53.133845 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="71825248-cfe0-44cf-90d9-ab17c019328d" containerName="nova-scheduler-scheduler" containerID="cri-o://d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240" gracePeriod=30 Feb 21 22:04:53 crc kubenswrapper[4717]: W0221 22:04:53.466592 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb088db80_9bd4_4573_9762_45c2be00a3e3.slice/crio-935ebcf2b02276c33f68e767b5928bce4b7e849e5b21362ee6e1c7313f0a1340 WatchSource:0}: Error finding container 
935ebcf2b02276c33f68e767b5928bce4b7e849e5b21362ee6e1c7313f0a1340: Status 404 returned error can't find the container with id 935ebcf2b02276c33f68e767b5928bce4b7e849e5b21362ee6e1c7313f0a1340 Feb 21 22:04:53 crc kubenswrapper[4717]: I0221 22:04:53.468422 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 21 22:04:53 crc kubenswrapper[4717]: I0221 22:04:53.987327 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f65bb5-3e40-46d5-81ef-433c345544ac" path="/var/lib/kubelet/pods/43f65bb5-3e40-46d5-81ef-433c345544ac/volumes" Feb 21 22:04:53 crc kubenswrapper[4717]: I0221 22:04:53.988083 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c717ce7b-5672-43dd-8cb9-fe1e97adf751" path="/var/lib/kubelet/pods/c717ce7b-5672-43dd-8cb9-fe1e97adf751/volumes" Feb 21 22:04:54 crc kubenswrapper[4717]: I0221 22:04:54.143435 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5e40eb10-e007-4fc0-97ff-de455effb430","Type":"ContainerStarted","Data":"f2ce287a5c9128019331409b2cba4ee4cfc959bc3c5b1e2346944ca5368bea47"} Feb 21 22:04:54 crc kubenswrapper[4717]: I0221 22:04:54.143803 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 21 22:04:54 crc kubenswrapper[4717]: I0221 22:04:54.146045 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b088db80-9bd4-4573-9762-45c2be00a3e3","Type":"ContainerStarted","Data":"1e95d672692ff29095340711071b6284f3a4195da4f5b5d5231cae38f2ca92cc"} Feb 21 22:04:54 crc kubenswrapper[4717]: I0221 22:04:54.146096 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b088db80-9bd4-4573-9762-45c2be00a3e3","Type":"ContainerStarted","Data":"fef405cc0ca65fff4777fcca547bd4744c91919178be0bd5c0951256ed9cae7a"} Feb 21 22:04:54 crc kubenswrapper[4717]: I0221 22:04:54.146115 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b088db80-9bd4-4573-9762-45c2be00a3e3","Type":"ContainerStarted","Data":"935ebcf2b02276c33f68e767b5928bce4b7e849e5b21362ee6e1c7313f0a1340"} Feb 21 22:04:54 crc kubenswrapper[4717]: I0221 22:04:54.174357 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.174341359 podStartE2EDuration="2.174341359s" podCreationTimestamp="2026-02-21 22:04:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:04:54.171365337 +0000 UTC m=+1108.952898969" watchObservedRunningTime="2026-02-21 22:04:54.174341359 +0000 UTC m=+1108.955874981" Feb 21 22:04:54 crc kubenswrapper[4717]: I0221 22:04:54.196034 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.19601664 podStartE2EDuration="2.19601664s" podCreationTimestamp="2026-02-21 22:04:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:04:54.192780872 +0000 UTC m=+1108.974314504" watchObservedRunningTime="2026-02-21 22:04:54.19601664 +0000 UTC m=+1108.977550272" Feb 21 22:04:55 crc kubenswrapper[4717]: E0221 22:04:55.819436 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 21 22:04:55 crc kubenswrapper[4717]: E0221 22:04:55.822284 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" containerID="d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 21 22:04:55 crc kubenswrapper[4717]: E0221 22:04:55.824696 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 21 22:04:55 crc kubenswrapper[4717]: E0221 22:04:55.824788 4717 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="71825248-cfe0-44cf-90d9-ab17c019328d" containerName="nova-scheduler-scheduler" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.063253 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.181349 4717 generic.go:334] "Generic (PLEG): container finished" podID="810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" containerID="b896ceeb989112a80b1a6240c22dc312139bb511e14f890f358b087c229d0eec" exitCode=0 Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.181644 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e","Type":"ContainerDied","Data":"b896ceeb989112a80b1a6240c22dc312139bb511e14f890f358b087c229d0eec"} Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.189345 4717 generic.go:334] "Generic (PLEG): container finished" podID="71825248-cfe0-44cf-90d9-ab17c019328d" containerID="d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240" exitCode=0 Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.189377 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"71825248-cfe0-44cf-90d9-ab17c019328d","Type":"ContainerDied","Data":"d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240"} Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.189397 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"71825248-cfe0-44cf-90d9-ab17c019328d","Type":"ContainerDied","Data":"d9c9cc5e8d6ba6dd1ae98c19d59f348dadc13e4a30974b1ba25063194eea0fcc"} Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.189412 4717 scope.go:117] "RemoveContainer" containerID="d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.189411 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.206541 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71825248-cfe0-44cf-90d9-ab17c019328d-combined-ca-bundle\") pod \"71825248-cfe0-44cf-90d9-ab17c019328d\" (UID: \"71825248-cfe0-44cf-90d9-ab17c019328d\") " Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.206654 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2chw\" (UniqueName: \"kubernetes.io/projected/71825248-cfe0-44cf-90d9-ab17c019328d-kube-api-access-p2chw\") pod \"71825248-cfe0-44cf-90d9-ab17c019328d\" (UID: \"71825248-cfe0-44cf-90d9-ab17c019328d\") " Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.206766 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71825248-cfe0-44cf-90d9-ab17c019328d-config-data\") pod \"71825248-cfe0-44cf-90d9-ab17c019328d\" (UID: \"71825248-cfe0-44cf-90d9-ab17c019328d\") " Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.228309 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71825248-cfe0-44cf-90d9-ab17c019328d-kube-api-access-p2chw" (OuterVolumeSpecName: "kube-api-access-p2chw") pod "71825248-cfe0-44cf-90d9-ab17c019328d" (UID: "71825248-cfe0-44cf-90d9-ab17c019328d"). InnerVolumeSpecName "kube-api-access-p2chw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.228512 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.249628 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71825248-cfe0-44cf-90d9-ab17c019328d-config-data" (OuterVolumeSpecName: "config-data") pod "71825248-cfe0-44cf-90d9-ab17c019328d" (UID: "71825248-cfe0-44cf-90d9-ab17c019328d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.254134 4717 scope.go:117] "RemoveContainer" containerID="d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240" Feb 21 22:04:57 crc kubenswrapper[4717]: E0221 22:04:57.254601 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240\": container with ID starting with d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240 not found: ID does not exist" containerID="d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.254640 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240"} err="failed to get container status \"d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240\": rpc error: code = NotFound desc = could not find container \"d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240\": container with ID starting with d00cd621254ec700f56808e1ebafff4cd255df1f7ddb0a80c6a02e8a5cdf0240 not found: ID does not exist" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.266622 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71825248-cfe0-44cf-90d9-ab17c019328d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"71825248-cfe0-44cf-90d9-ab17c019328d" (UID: "71825248-cfe0-44cf-90d9-ab17c019328d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.308932 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-logs\") pod \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\" (UID: \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.308988 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-combined-ca-bundle\") pod \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\" (UID: \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.309022 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-config-data\") pod \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\" (UID: \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.309165 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9dg7\" (UniqueName: \"kubernetes.io/projected/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-kube-api-access-t9dg7\") pod \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\" (UID: \"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e\") " Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.309581 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-logs" (OuterVolumeSpecName: "logs") pod "810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" (UID: "810f167b-3b42-4a4d-b4ae-db15cc0ebb4e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.309719 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-logs\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.309744 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71825248-cfe0-44cf-90d9-ab17c019328d-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.309758 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71825248-cfe0-44cf-90d9-ab17c019328d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.309771 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2chw\" (UniqueName: \"kubernetes.io/projected/71825248-cfe0-44cf-90d9-ab17c019328d-kube-api-access-p2chw\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.315246 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-kube-api-access-t9dg7" (OuterVolumeSpecName: "kube-api-access-t9dg7") pod "810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" (UID: "810f167b-3b42-4a4d-b4ae-db15cc0ebb4e"). InnerVolumeSpecName "kube-api-access-t9dg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.336318 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" (UID: "810f167b-3b42-4a4d-b4ae-db15cc0ebb4e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.342189 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-config-data" (OuterVolumeSpecName: "config-data") pod "810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" (UID: "810f167b-3b42-4a4d-b4ae-db15cc0ebb4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.411577 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.411613 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.411623 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9dg7\" (UniqueName: \"kubernetes.io/projected/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e-kube-api-access-t9dg7\") on node \"crc\" DevicePath \"\"" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.528538 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.538253 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.547152 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 21 22:04:57 crc kubenswrapper[4717]: E0221 22:04:57.547495 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" containerName="nova-api-log" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 
22:04:57.547509 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" containerName="nova-api-log" Feb 21 22:04:57 crc kubenswrapper[4717]: E0221 22:04:57.547534 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71825248-cfe0-44cf-90d9-ab17c019328d" containerName="nova-scheduler-scheduler" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.547541 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="71825248-cfe0-44cf-90d9-ab17c019328d" containerName="nova-scheduler-scheduler" Feb 21 22:04:57 crc kubenswrapper[4717]: E0221 22:04:57.547557 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" containerName="nova-api-api" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.547564 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" containerName="nova-api-api" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.547740 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" containerName="nova-api-api" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.547760 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" containerName="nova-api-log" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.547770 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="71825248-cfe0-44cf-90d9-ab17c019328d" containerName="nova-scheduler-scheduler" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.548341 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.550841 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.562685 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.616692 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b032c8-f876-4d16-96f2-0058013b38a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c5b032c8-f876-4d16-96f2-0058013b38a3\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.616728 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b032c8-f876-4d16-96f2-0058013b38a3-config-data\") pod \"nova-scheduler-0\" (UID: \"c5b032c8-f876-4d16-96f2-0058013b38a3\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.616905 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fqhg\" (UniqueName: \"kubernetes.io/projected/c5b032c8-f876-4d16-96f2-0058013b38a3-kube-api-access-7fqhg\") pod \"nova-scheduler-0\" (UID: \"c5b032c8-f876-4d16-96f2-0058013b38a3\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.718558 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fqhg\" (UniqueName: \"kubernetes.io/projected/c5b032c8-f876-4d16-96f2-0058013b38a3-kube-api-access-7fqhg\") pod \"nova-scheduler-0\" (UID: \"c5b032c8-f876-4d16-96f2-0058013b38a3\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.718930 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b032c8-f876-4d16-96f2-0058013b38a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c5b032c8-f876-4d16-96f2-0058013b38a3\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.718953 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b032c8-f876-4d16-96f2-0058013b38a3-config-data\") pod \"nova-scheduler-0\" (UID: \"c5b032c8-f876-4d16-96f2-0058013b38a3\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.724602 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b032c8-f876-4d16-96f2-0058013b38a3-config-data\") pod \"nova-scheduler-0\" (UID: \"c5b032c8-f876-4d16-96f2-0058013b38a3\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.725654 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b032c8-f876-4d16-96f2-0058013b38a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c5b032c8-f876-4d16-96f2-0058013b38a3\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.738751 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fqhg\" (UniqueName: \"kubernetes.io/projected/c5b032c8-f876-4d16-96f2-0058013b38a3-kube-api-access-7fqhg\") pod \"nova-scheduler-0\" (UID: \"c5b032c8-f876-4d16-96f2-0058013b38a3\") " pod="openstack/nova-scheduler-0" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.912966 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 21 22:04:57 crc kubenswrapper[4717]: I0221 22:04:57.988958 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71825248-cfe0-44cf-90d9-ab17c019328d" path="/var/lib/kubelet/pods/71825248-cfe0-44cf-90d9-ab17c019328d/volumes" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.013997 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.014253 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.199902 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.199918 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"810f167b-3b42-4a4d-b4ae-db15cc0ebb4e","Type":"ContainerDied","Data":"339b6aa707e66be825d19ab9ee4898166ee4fe5fb1d80a3c1d543d1da37ffa60"} Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.200366 4717 scope.go:117] "RemoveContainer" containerID="b896ceeb989112a80b1a6240c22dc312139bb511e14f890f358b087c229d0eec" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.224558 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.238723 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.244301 4717 scope.go:117] "RemoveContainer" containerID="e60d78b24480d96463df5ca472a07a28120d4cab4e7149f95088660dc4454286" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.250547 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.252306 4717 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-api-0" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.256239 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.279960 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.341127 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4rsw\" (UniqueName: \"kubernetes.io/projected/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-kube-api-access-z4rsw\") pod \"nova-api-0\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " pod="openstack/nova-api-0" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.341189 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-config-data\") pod \"nova-api-0\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " pod="openstack/nova-api-0" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.341258 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " pod="openstack/nova-api-0" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.341344 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-logs\") pod \"nova-api-0\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " pod="openstack/nova-api-0" Feb 21 22:04:58 crc kubenswrapper[4717]: W0221 22:04:58.363025 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5b032c8_f876_4d16_96f2_0058013b38a3.slice/crio-c12f695aa9b9a725a8c4d8e1d54801aa235beae08dfc8e8f72d88749626f735b WatchSource:0}: Error finding container c12f695aa9b9a725a8c4d8e1d54801aa235beae08dfc8e8f72d88749626f735b: Status 404 returned error can't find the container with id c12f695aa9b9a725a8c4d8e1d54801aa235beae08dfc8e8f72d88749626f735b Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.364428 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.442775 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4rsw\" (UniqueName: \"kubernetes.io/projected/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-kube-api-access-z4rsw\") pod \"nova-api-0\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " pod="openstack/nova-api-0" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.442826 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-config-data\") pod \"nova-api-0\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " pod="openstack/nova-api-0" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.442887 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " pod="openstack/nova-api-0" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.442933 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-logs\") pod \"nova-api-0\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " pod="openstack/nova-api-0" Feb 21 22:04:58 crc 
kubenswrapper[4717]: I0221 22:04:58.443638 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-logs\") pod \"nova-api-0\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " pod="openstack/nova-api-0" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.450775 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " pod="openstack/nova-api-0" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.451028 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-config-data\") pod \"nova-api-0\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " pod="openstack/nova-api-0" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.460027 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4rsw\" (UniqueName: \"kubernetes.io/projected/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-kube-api-access-z4rsw\") pod \"nova-api-0\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " pod="openstack/nova-api-0" Feb 21 22:04:58 crc kubenswrapper[4717]: I0221 22:04:58.579264 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 21 22:04:59 crc kubenswrapper[4717]: I0221 22:04:59.165098 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:04:59 crc kubenswrapper[4717]: W0221 22:04:59.169076 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b0f7c0c_6a56_4423_9e79_2ac4519fd035.slice/crio-7efc203e8ed9360678578a6fffc5d52b1143fa805e04be93268cadf223e2b727 WatchSource:0}: Error finding container 7efc203e8ed9360678578a6fffc5d52b1143fa805e04be93268cadf223e2b727: Status 404 returned error can't find the container with id 7efc203e8ed9360678578a6fffc5d52b1143fa805e04be93268cadf223e2b727 Feb 21 22:04:59 crc kubenswrapper[4717]: I0221 22:04:59.213135 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b0f7c0c-6a56-4423-9e79-2ac4519fd035","Type":"ContainerStarted","Data":"7efc203e8ed9360678578a6fffc5d52b1143fa805e04be93268cadf223e2b727"} Feb 21 22:04:59 crc kubenswrapper[4717]: I0221 22:04:59.216150 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c5b032c8-f876-4d16-96f2-0058013b38a3","Type":"ContainerStarted","Data":"d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1"} Feb 21 22:04:59 crc kubenswrapper[4717]: I0221 22:04:59.216201 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c5b032c8-f876-4d16-96f2-0058013b38a3","Type":"ContainerStarted","Data":"c12f695aa9b9a725a8c4d8e1d54801aa235beae08dfc8e8f72d88749626f735b"} Feb 21 22:04:59 crc kubenswrapper[4717]: I0221 22:04:59.233610 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.233589838 podStartE2EDuration="2.233589838s" podCreationTimestamp="2026-02-21 22:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:04:59.230715439 +0000 UTC m=+1114.012249081" watchObservedRunningTime="2026-02-21 22:04:59.233589838 +0000 UTC m=+1114.015123460" Feb 21 22:04:59 crc kubenswrapper[4717]: I0221 22:04:59.987405 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="810f167b-3b42-4a4d-b4ae-db15cc0ebb4e" path="/var/lib/kubelet/pods/810f167b-3b42-4a4d-b4ae-db15cc0ebb4e/volumes" Feb 21 22:05:00 crc kubenswrapper[4717]: I0221 22:05:00.232163 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b0f7c0c-6a56-4423-9e79-2ac4519fd035","Type":"ContainerStarted","Data":"5e0ef0b6124f27113098a061522c619e4bd816ec02fd17aa0cc2f5e4c2d2ac85"} Feb 21 22:05:00 crc kubenswrapper[4717]: I0221 22:05:00.232219 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b0f7c0c-6a56-4423-9e79-2ac4519fd035","Type":"ContainerStarted","Data":"0bd99ac19688b8300c10fc113636fbbb990bf61fd1829acf52a023ec8ca71f2a"} Feb 21 22:05:00 crc kubenswrapper[4717]: I0221 22:05:00.256636 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.256618331 podStartE2EDuration="2.256618331s" podCreationTimestamp="2026-02-21 22:04:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:05:00.253596269 +0000 UTC m=+1115.035129921" watchObservedRunningTime="2026-02-21 22:05:00.256618331 +0000 UTC m=+1115.038151953" Feb 21 22:05:01 crc kubenswrapper[4717]: I0221 22:05:01.141200 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 21 22:05:02 crc kubenswrapper[4717]: I0221 22:05:02.570049 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 21 22:05:02 crc kubenswrapper[4717]: 
I0221 22:05:02.913179 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 21 22:05:03 crc kubenswrapper[4717]: I0221 22:05:03.014011 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 21 22:05:03 crc kubenswrapper[4717]: I0221 22:05:03.014075 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 21 22:05:04 crc kubenswrapper[4717]: I0221 22:05:04.066096 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b088db80-9bd4-4573-9762-45c2be00a3e3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 21 22:05:04 crc kubenswrapper[4717]: I0221 22:05:04.066105 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b088db80-9bd4-4573-9762-45c2be00a3e3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 21 22:05:07 crc kubenswrapper[4717]: I0221 22:05:07.913937 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 21 22:05:07 crc kubenswrapper[4717]: I0221 22:05:07.962683 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 21 22:05:08 crc kubenswrapper[4717]: I0221 22:05:08.395647 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 21 22:05:08 crc kubenswrapper[4717]: I0221 22:05:08.581244 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 21 22:05:08 crc kubenswrapper[4717]: I0221 22:05:08.581313 4717 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 21 22:05:09 crc kubenswrapper[4717]: I0221 22:05:09.062465 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:05:09 crc kubenswrapper[4717]: I0221 22:05:09.063330 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:05:09 crc kubenswrapper[4717]: I0221 22:05:09.063465 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 22:05:09 crc kubenswrapper[4717]: I0221 22:05:09.064183 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d6ea5ddcf698b572f76b6bdcff7985d26f0ef62fef8084d68925d625b747dd34"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 22:05:09 crc kubenswrapper[4717]: I0221 22:05:09.064299 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://d6ea5ddcf698b572f76b6bdcff7985d26f0ef62fef8084d68925d625b747dd34" gracePeriod=600 Feb 21 22:05:09 crc kubenswrapper[4717]: I0221 22:05:09.350277 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerID="d6ea5ddcf698b572f76b6bdcff7985d26f0ef62fef8084d68925d625b747dd34" exitCode=0 Feb 21 22:05:09 crc kubenswrapper[4717]: I0221 22:05:09.350346 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"d6ea5ddcf698b572f76b6bdcff7985d26f0ef62fef8084d68925d625b747dd34"} Feb 21 22:05:09 crc kubenswrapper[4717]: I0221 22:05:09.350620 4717 scope.go:117] "RemoveContainer" containerID="2d284cad9372c32723c3911aa224f8fd37b88ced35957297bd0664e6eabafd92" Feb 21 22:05:09 crc kubenswrapper[4717]: I0221 22:05:09.664173 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9b0f7c0c-6a56-4423-9e79-2ac4519fd035" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 21 22:05:09 crc kubenswrapper[4717]: I0221 22:05:09.664199 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9b0f7c0c-6a56-4423-9e79-2ac4519fd035" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 21 22:05:10 crc kubenswrapper[4717]: I0221 22:05:10.366911 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"d1519f9b66326a62378129e107d074b9de1f98d8964cf8073c190668b31d38eb"} Feb 21 22:05:13 crc kubenswrapper[4717]: I0221 22:05:13.020007 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 21 22:05:13 crc kubenswrapper[4717]: I0221 22:05:13.021341 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-metadata-0" Feb 21 22:05:13 crc kubenswrapper[4717]: I0221 22:05:13.030650 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 21 22:05:13 crc kubenswrapper[4717]: I0221 22:05:13.410807 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 21 22:05:16 crc kubenswrapper[4717]: I0221 22:05:16.442527 4717 generic.go:334] "Generic (PLEG): container finished" podID="e82c81fb-76a2-4dc1-b082-cafb14ae4dc3" containerID="9b5a44b32186bf0ff77932e89566cf94762797a604a2c2cb8f1b7fb6f249a9f0" exitCode=137 Feb 21 22:05:16 crc kubenswrapper[4717]: I0221 22:05:16.442605 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3","Type":"ContainerDied","Data":"9b5a44b32186bf0ff77932e89566cf94762797a604a2c2cb8f1b7fb6f249a9f0"} Feb 21 22:05:16 crc kubenswrapper[4717]: I0221 22:05:16.443269 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3","Type":"ContainerDied","Data":"d84fe3f0e3eb0ee8a83ffa4c452348561d3827c3722a9a7a2705ae1520170e7b"} Feb 21 22:05:16 crc kubenswrapper[4717]: I0221 22:05:16.443297 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d84fe3f0e3eb0ee8a83ffa4c452348561d3827c3722a9a7a2705ae1520170e7b" Feb 21 22:05:16 crc kubenswrapper[4717]: I0221 22:05:16.495512 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:16 crc kubenswrapper[4717]: I0221 22:05:16.653252 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7k8w\" (UniqueName: \"kubernetes.io/projected/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-kube-api-access-r7k8w\") pod \"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3\" (UID: \"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3\") " Feb 21 22:05:16 crc kubenswrapper[4717]: I0221 22:05:16.653533 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-config-data\") pod \"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3\" (UID: \"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3\") " Feb 21 22:05:16 crc kubenswrapper[4717]: I0221 22:05:16.653608 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-combined-ca-bundle\") pod \"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3\" (UID: \"e82c81fb-76a2-4dc1-b082-cafb14ae4dc3\") " Feb 21 22:05:16 crc kubenswrapper[4717]: I0221 22:05:16.661647 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-kube-api-access-r7k8w" (OuterVolumeSpecName: "kube-api-access-r7k8w") pod "e82c81fb-76a2-4dc1-b082-cafb14ae4dc3" (UID: "e82c81fb-76a2-4dc1-b082-cafb14ae4dc3"). InnerVolumeSpecName "kube-api-access-r7k8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:05:16 crc kubenswrapper[4717]: I0221 22:05:16.690354 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-config-data" (OuterVolumeSpecName: "config-data") pod "e82c81fb-76a2-4dc1-b082-cafb14ae4dc3" (UID: "e82c81fb-76a2-4dc1-b082-cafb14ae4dc3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:16 crc kubenswrapper[4717]: I0221 22:05:16.695616 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e82c81fb-76a2-4dc1-b082-cafb14ae4dc3" (UID: "e82c81fb-76a2-4dc1-b082-cafb14ae4dc3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:16 crc kubenswrapper[4717]: I0221 22:05:16.756807 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:16 crc kubenswrapper[4717]: I0221 22:05:16.756879 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7k8w\" (UniqueName: \"kubernetes.io/projected/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-kube-api-access-r7k8w\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:16 crc kubenswrapper[4717]: I0221 22:05:16.756898 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.454746 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.517129 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.532974 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.543341 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 21 22:05:17 crc kubenswrapper[4717]: E0221 22:05:17.543778 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82c81fb-76a2-4dc1-b082-cafb14ae4dc3" containerName="nova-cell1-novncproxy-novncproxy" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.543798 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82c81fb-76a2-4dc1-b082-cafb14ae4dc3" containerName="nova-cell1-novncproxy-novncproxy" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.544072 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82c81fb-76a2-4dc1-b082-cafb14ae4dc3" containerName="nova-cell1-novncproxy-novncproxy" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.544740 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.549267 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.549507 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.549642 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.564763 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.689397 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7413e9-1833-491d-8d42-0ac926edea33-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a7413e9-1833-491d-8d42-0ac926edea33\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.689486 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7413e9-1833-491d-8d42-0ac926edea33-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a7413e9-1833-491d-8d42-0ac926edea33\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.689667 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a7413e9-1833-491d-8d42-0ac926edea33-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a7413e9-1833-491d-8d42-0ac926edea33\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: 
I0221 22:05:17.689926 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a7413e9-1833-491d-8d42-0ac926edea33-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a7413e9-1833-491d-8d42-0ac926edea33\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.690035 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqhgh\" (UniqueName: \"kubernetes.io/projected/7a7413e9-1833-491d-8d42-0ac926edea33-kube-api-access-sqhgh\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a7413e9-1833-491d-8d42-0ac926edea33\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.792414 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7413e9-1833-491d-8d42-0ac926edea33-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a7413e9-1833-491d-8d42-0ac926edea33\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.792516 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a7413e9-1833-491d-8d42-0ac926edea33-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a7413e9-1833-491d-8d42-0ac926edea33\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.792662 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a7413e9-1833-491d-8d42-0ac926edea33-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a7413e9-1833-491d-8d42-0ac926edea33\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 
22:05:17.792723 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqhgh\" (UniqueName: \"kubernetes.io/projected/7a7413e9-1833-491d-8d42-0ac926edea33-kube-api-access-sqhgh\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a7413e9-1833-491d-8d42-0ac926edea33\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.792900 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7413e9-1833-491d-8d42-0ac926edea33-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a7413e9-1833-491d-8d42-0ac926edea33\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.798066 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a7413e9-1833-491d-8d42-0ac926edea33-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a7413e9-1833-491d-8d42-0ac926edea33\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.798654 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7413e9-1833-491d-8d42-0ac926edea33-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a7413e9-1833-491d-8d42-0ac926edea33\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.803502 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a7413e9-1833-491d-8d42-0ac926edea33-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a7413e9-1833-491d-8d42-0ac926edea33\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.808936 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7413e9-1833-491d-8d42-0ac926edea33-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a7413e9-1833-491d-8d42-0ac926edea33\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.817799 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqhgh\" (UniqueName: \"kubernetes.io/projected/7a7413e9-1833-491d-8d42-0ac926edea33-kube-api-access-sqhgh\") pod \"nova-cell1-novncproxy-0\" (UID: \"7a7413e9-1833-491d-8d42-0ac926edea33\") " pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:17 crc kubenswrapper[4717]: I0221 22:05:17.915800 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:18 crc kubenswrapper[4717]: I0221 22:05:18.005350 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e82c81fb-76a2-4dc1-b082-cafb14ae4dc3" path="/var/lib/kubelet/pods/e82c81fb-76a2-4dc1-b082-cafb14ae4dc3/volumes" Feb 21 22:05:18 crc kubenswrapper[4717]: I0221 22:05:18.413603 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 21 22:05:18 crc kubenswrapper[4717]: W0221 22:05:18.420355 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a7413e9_1833_491d_8d42_0ac926edea33.slice/crio-4eb30adc4af3d3f64b1463f016ccc45316b562539b96881ce7a5a39dcf1941ed WatchSource:0}: Error finding container 4eb30adc4af3d3f64b1463f016ccc45316b562539b96881ce7a5a39dcf1941ed: Status 404 returned error can't find the container with id 4eb30adc4af3d3f64b1463f016ccc45316b562539b96881ce7a5a39dcf1941ed Feb 21 22:05:18 crc kubenswrapper[4717]: I0221 22:05:18.469362 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"7a7413e9-1833-491d-8d42-0ac926edea33","Type":"ContainerStarted","Data":"4eb30adc4af3d3f64b1463f016ccc45316b562539b96881ce7a5a39dcf1941ed"} Feb 21 22:05:18 crc kubenswrapper[4717]: I0221 22:05:18.584547 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 21 22:05:18 crc kubenswrapper[4717]: I0221 22:05:18.585209 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 21 22:05:18 crc kubenswrapper[4717]: I0221 22:05:18.586303 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 21 22:05:18 crc kubenswrapper[4717]: I0221 22:05:18.589566 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.484595 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7a7413e9-1833-491d-8d42-0ac926edea33","Type":"ContainerStarted","Data":"f4be8a0f7c9e1530008ef6dadea1f1bde98a796cad977b0803f9669c680a1be9"} Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.485327 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.492165 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.517743 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.517706689 podStartE2EDuration="2.517706689s" podCreationTimestamp="2026-02-21 22:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:05:19.512917765 +0000 UTC m=+1134.294451397" watchObservedRunningTime="2026-02-21 22:05:19.517706689 +0000 UTC 
m=+1134.299240351" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.740329 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-82jvl"] Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.742237 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.758740 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-82jvl"] Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.835830 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-config\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.835917 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.835962 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.836002 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-dns-svc\") 
pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.836023 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.836067 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcc5f\" (UniqueName: \"kubernetes.io/projected/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-kube-api-access-pcc5f\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.938205 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.938734 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.939408 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-dns-svc\") pod 
\"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.939575 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.939908 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcc5f\" (UniqueName: \"kubernetes.io/projected/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-kube-api-access-pcc5f\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.940365 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-config\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.940970 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-config\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.941151 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: 
\"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.941291 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.941892 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.942004 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:19 crc kubenswrapper[4717]: I0221 22:05:19.970285 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcc5f\" (UniqueName: \"kubernetes.io/projected/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-kube-api-access-pcc5f\") pod \"dnsmasq-dns-59cf4bdb65-82jvl\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:20 crc kubenswrapper[4717]: I0221 22:05:20.069230 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:20 crc kubenswrapper[4717]: I0221 22:05:20.526736 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-82jvl"] Feb 21 22:05:21 crc kubenswrapper[4717]: I0221 22:05:21.502073 4717 generic.go:334] "Generic (PLEG): container finished" podID="29abbd2c-eeb4-49e9-afd3-8947f3b50ba8" containerID="88cf53c5b5a67caaf62ff548cc2b3d4e3a8774262ac9a58237f434008fa28f60" exitCode=0 Feb 21 22:05:21 crc kubenswrapper[4717]: I0221 22:05:21.502160 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" event={"ID":"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8","Type":"ContainerDied","Data":"88cf53c5b5a67caaf62ff548cc2b3d4e3a8774262ac9a58237f434008fa28f60"} Feb 21 22:05:21 crc kubenswrapper[4717]: I0221 22:05:21.502214 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" event={"ID":"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8","Type":"ContainerStarted","Data":"7ebfd5a7e70aa2b41da406478a2f88f1478b8bc92937e5eecb8214a54b263dbd"} Feb 21 22:05:21 crc kubenswrapper[4717]: I0221 22:05:21.648031 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:05:21 crc kubenswrapper[4717]: I0221 22:05:21.648550 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="ceilometer-central-agent" containerID="cri-o://1fc3990da8a449c797db79a0daf55c07976f37c64f662b1070c8a9ccfb737cb1" gracePeriod=30 Feb 21 22:05:21 crc kubenswrapper[4717]: I0221 22:05:21.648699 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="proxy-httpd" containerID="cri-o://58c669d76d700050ee913328cb57d888861f91b56580971ac93a34a27dbf6b17" gracePeriod=30 Feb 21 22:05:21 crc 
kubenswrapper[4717]: I0221 22:05:21.648746 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="sg-core" containerID="cri-o://05a5b8f2fbc304dad253dac2d9baf95e1ff5e923043963e5682dae0812be1e12" gracePeriod=30 Feb 21 22:05:21 crc kubenswrapper[4717]: I0221 22:05:21.648774 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="ceilometer-notification-agent" containerID="cri-o://d45b0655569eb54f03b33d2ed524b2447f3616daf5c141674884248c96e41f3d" gracePeriod=30 Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.117152 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.534607 4717 generic.go:334] "Generic (PLEG): container finished" podID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerID="58c669d76d700050ee913328cb57d888861f91b56580971ac93a34a27dbf6b17" exitCode=0 Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.534855 4717 generic.go:334] "Generic (PLEG): container finished" podID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerID="05a5b8f2fbc304dad253dac2d9baf95e1ff5e923043963e5682dae0812be1e12" exitCode=2 Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.534884 4717 generic.go:334] "Generic (PLEG): container finished" podID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerID="d45b0655569eb54f03b33d2ed524b2447f3616daf5c141674884248c96e41f3d" exitCode=0 Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.534896 4717 generic.go:334] "Generic (PLEG): container finished" podID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerID="1fc3990da8a449c797db79a0daf55c07976f37c64f662b1070c8a9ccfb737cb1" exitCode=0 Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.534943 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"64cab48b-28c7-4f30-9e12-b3001c6a5d35","Type":"ContainerDied","Data":"58c669d76d700050ee913328cb57d888861f91b56580971ac93a34a27dbf6b17"} Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.534972 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64cab48b-28c7-4f30-9e12-b3001c6a5d35","Type":"ContainerDied","Data":"05a5b8f2fbc304dad253dac2d9baf95e1ff5e923043963e5682dae0812be1e12"} Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.534986 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64cab48b-28c7-4f30-9e12-b3001c6a5d35","Type":"ContainerDied","Data":"d45b0655569eb54f03b33d2ed524b2447f3616daf5c141674884248c96e41f3d"} Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.535000 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64cab48b-28c7-4f30-9e12-b3001c6a5d35","Type":"ContainerDied","Data":"1fc3990da8a449c797db79a0daf55c07976f37c64f662b1070c8a9ccfb737cb1"} Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.537132 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9b0f7c0c-6a56-4423-9e79-2ac4519fd035" containerName="nova-api-log" containerID="cri-o://0bd99ac19688b8300c10fc113636fbbb990bf61fd1829acf52a023ec8ca71f2a" gracePeriod=30 Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.538078 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" event={"ID":"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8","Type":"ContainerStarted","Data":"c3759846fd9fb78a9f4a5a957b9136d54279f09b37c80dbb36da8e0a1a680b74"} Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.538112 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.538503 4717 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/nova-api-0" podUID="9b0f7c0c-6a56-4423-9e79-2ac4519fd035" containerName="nova-api-api" containerID="cri-o://5e0ef0b6124f27113098a061522c619e4bd816ec02fd17aa0cc2f5e4c2d2ac85" gracePeriod=30 Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.581300 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" podStartSLOduration=3.581276275 podStartE2EDuration="3.581276275s" podCreationTimestamp="2026-02-21 22:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:05:22.575134018 +0000 UTC m=+1137.356667640" watchObservedRunningTime="2026-02-21 22:05:22.581276275 +0000 UTC m=+1137.362809917" Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.825480 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 21 22:05:22 crc kubenswrapper[4717]: I0221 22:05:22.916164 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.014049 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-combined-ca-bundle\") pod \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.014206 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64cab48b-28c7-4f30-9e12-b3001c6a5d35-run-httpd\") pod \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.014236 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-ghd8n\" (UniqueName: \"kubernetes.io/projected/64cab48b-28c7-4f30-9e12-b3001c6a5d35-kube-api-access-ghd8n\") pod \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.014285 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64cab48b-28c7-4f30-9e12-b3001c6a5d35-log-httpd\") pod \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.014369 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-scripts\") pod \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.014416 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-sg-core-conf-yaml\") pod \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.014661 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64cab48b-28c7-4f30-9e12-b3001c6a5d35-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "64cab48b-28c7-4f30-9e12-b3001c6a5d35" (UID: "64cab48b-28c7-4f30-9e12-b3001c6a5d35"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.014892 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-config-data\") pod \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.014889 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64cab48b-28c7-4f30-9e12-b3001c6a5d35-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "64cab48b-28c7-4f30-9e12-b3001c6a5d35" (UID: "64cab48b-28c7-4f30-9e12-b3001c6a5d35"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.015077 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-ceilometer-tls-certs\") pod \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\" (UID: \"64cab48b-28c7-4f30-9e12-b3001c6a5d35\") " Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.015967 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64cab48b-28c7-4f30-9e12-b3001c6a5d35-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.016252 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64cab48b-28c7-4f30-9e12-b3001c6a5d35-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.037011 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64cab48b-28c7-4f30-9e12-b3001c6a5d35-kube-api-access-ghd8n" (OuterVolumeSpecName: "kube-api-access-ghd8n") pod 
"64cab48b-28c7-4f30-9e12-b3001c6a5d35" (UID: "64cab48b-28c7-4f30-9e12-b3001c6a5d35"). InnerVolumeSpecName "kube-api-access-ghd8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.039582 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-scripts" (OuterVolumeSpecName: "scripts") pod "64cab48b-28c7-4f30-9e12-b3001c6a5d35" (UID: "64cab48b-28c7-4f30-9e12-b3001c6a5d35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.081737 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "64cab48b-28c7-4f30-9e12-b3001c6a5d35" (UID: "64cab48b-28c7-4f30-9e12-b3001c6a5d35"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.083652 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "64cab48b-28c7-4f30-9e12-b3001c6a5d35" (UID: "64cab48b-28c7-4f30-9e12-b3001c6a5d35"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.093003 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64cab48b-28c7-4f30-9e12-b3001c6a5d35" (UID: "64cab48b-28c7-4f30-9e12-b3001c6a5d35"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.117014 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghd8n\" (UniqueName: \"kubernetes.io/projected/64cab48b-28c7-4f30-9e12-b3001c6a5d35-kube-api-access-ghd8n\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.117040 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.117048 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.117057 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.117065 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.128723 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-config-data" (OuterVolumeSpecName: "config-data") pod "64cab48b-28c7-4f30-9e12-b3001c6a5d35" (UID: "64cab48b-28c7-4f30-9e12-b3001c6a5d35"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.218538 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64cab48b-28c7-4f30-9e12-b3001c6a5d35-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.547569 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64cab48b-28c7-4f30-9e12-b3001c6a5d35","Type":"ContainerDied","Data":"5754fda59fa5773cc79b9a2580c5680bf3e6cc8eca325624ae66457f14618c88"} Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.547618 4717 scope.go:117] "RemoveContainer" containerID="58c669d76d700050ee913328cb57d888861f91b56580971ac93a34a27dbf6b17" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.547654 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.549848 4717 generic.go:334] "Generic (PLEG): container finished" podID="9b0f7c0c-6a56-4423-9e79-2ac4519fd035" containerID="0bd99ac19688b8300c10fc113636fbbb990bf61fd1829acf52a023ec8ca71f2a" exitCode=143 Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.550096 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b0f7c0c-6a56-4423-9e79-2ac4519fd035","Type":"ContainerDied","Data":"0bd99ac19688b8300c10fc113636fbbb990bf61fd1829acf52a023ec8ca71f2a"} Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.577150 4717 scope.go:117] "RemoveContainer" containerID="05a5b8f2fbc304dad253dac2d9baf95e1ff5e923043963e5682dae0812be1e12" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.597219 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.606602 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 21 
22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.615966 4717 scope.go:117] "RemoveContainer" containerID="d45b0655569eb54f03b33d2ed524b2447f3616daf5c141674884248c96e41f3d" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.626644 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:05:23 crc kubenswrapper[4717]: E0221 22:05:23.627115 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="proxy-httpd" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.627139 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="proxy-httpd" Feb 21 22:05:23 crc kubenswrapper[4717]: E0221 22:05:23.627168 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="ceilometer-notification-agent" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.627177 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="ceilometer-notification-agent" Feb 21 22:05:23 crc kubenswrapper[4717]: E0221 22:05:23.627189 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="ceilometer-central-agent" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.627197 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="ceilometer-central-agent" Feb 21 22:05:23 crc kubenswrapper[4717]: E0221 22:05:23.627217 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="sg-core" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.627224 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="sg-core" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.627431 4717 
memory_manager.go:354] "RemoveStaleState removing state" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="proxy-httpd" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.627450 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="ceilometer-central-agent" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.627467 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="sg-core" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.627482 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" containerName="ceilometer-notification-agent" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.629506 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.632604 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.632885 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.633487 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.636696 4717 scope.go:117] "RemoveContainer" containerID="1fc3990da8a449c797db79a0daf55c07976f37c64f662b1070c8a9ccfb737cb1" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.650973 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.728729 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/08a3f1a6-792f-4679-84f0-55795cb63990-run-httpd\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.728777 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a3f1a6-792f-4679-84f0-55795cb63990-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.728813 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08a3f1a6-792f-4679-84f0-55795cb63990-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.728887 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08a3f1a6-792f-4679-84f0-55795cb63990-scripts\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.728946 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a3f1a6-792f-4679-84f0-55795cb63990-config-data\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.728973 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5lmc\" (UniqueName: \"kubernetes.io/projected/08a3f1a6-792f-4679-84f0-55795cb63990-kube-api-access-m5lmc\") pod \"ceilometer-0\" (UID: 
\"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.729020 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/08a3f1a6-792f-4679-84f0-55795cb63990-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.729058 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08a3f1a6-792f-4679-84f0-55795cb63990-log-httpd\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.832747 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08a3f1a6-792f-4679-84f0-55795cb63990-scripts\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.832963 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a3f1a6-792f-4679-84f0-55795cb63990-config-data\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.833026 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5lmc\" (UniqueName: \"kubernetes.io/projected/08a3f1a6-792f-4679-84f0-55795cb63990-kube-api-access-m5lmc\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.833099 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/08a3f1a6-792f-4679-84f0-55795cb63990-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.833183 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08a3f1a6-792f-4679-84f0-55795cb63990-log-httpd\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.833398 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08a3f1a6-792f-4679-84f0-55795cb63990-run-httpd\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.833443 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a3f1a6-792f-4679-84f0-55795cb63990-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.833513 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08a3f1a6-792f-4679-84f0-55795cb63990-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.835710 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08a3f1a6-792f-4679-84f0-55795cb63990-log-httpd\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " 
pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.840708 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a3f1a6-792f-4679-84f0-55795cb63990-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.840750 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a3f1a6-792f-4679-84f0-55795cb63990-config-data\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.841402 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08a3f1a6-792f-4679-84f0-55795cb63990-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.841805 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08a3f1a6-792f-4679-84f0-55795cb63990-run-httpd\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.842913 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/08a3f1a6-792f-4679-84f0-55795cb63990-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.853805 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/08a3f1a6-792f-4679-84f0-55795cb63990-scripts\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.854309 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5lmc\" (UniqueName: \"kubernetes.io/projected/08a3f1a6-792f-4679-84f0-55795cb63990-kube-api-access-m5lmc\") pod \"ceilometer-0\" (UID: \"08a3f1a6-792f-4679-84f0-55795cb63990\") " pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.961080 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 21 22:05:23 crc kubenswrapper[4717]: I0221 22:05:23.998222 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64cab48b-28c7-4f30-9e12-b3001c6a5d35" path="/var/lib/kubelet/pods/64cab48b-28c7-4f30-9e12-b3001c6a5d35/volumes" Feb 21 22:05:24 crc kubenswrapper[4717]: I0221 22:05:24.461765 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 21 22:05:24 crc kubenswrapper[4717]: I0221 22:05:24.562518 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08a3f1a6-792f-4679-84f0-55795cb63990","Type":"ContainerStarted","Data":"0086368c03efb31c7169dcc18722c15058c18bad88ca904472f1b17ddd382d36"} Feb 21 22:05:25 crc kubenswrapper[4717]: I0221 22:05:25.579203 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08a3f1a6-792f-4679-84f0-55795cb63990","Type":"ContainerStarted","Data":"441cca1cf0b9d302af4545ab645ccc2f079a2cbd8eb039ab93e049a226897ec4"} Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.511036 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.592424 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08a3f1a6-792f-4679-84f0-55795cb63990","Type":"ContainerStarted","Data":"7817a5e01a6f2e68ebe78957f7ff691e7cb764c17138bcc96a4b715097b71c5c"} Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.609062 4717 generic.go:334] "Generic (PLEG): container finished" podID="9b0f7c0c-6a56-4423-9e79-2ac4519fd035" containerID="5e0ef0b6124f27113098a061522c619e4bd816ec02fd17aa0cc2f5e4c2d2ac85" exitCode=0 Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.609105 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b0f7c0c-6a56-4423-9e79-2ac4519fd035","Type":"ContainerDied","Data":"5e0ef0b6124f27113098a061522c619e4bd816ec02fd17aa0cc2f5e4c2d2ac85"} Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.609131 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b0f7c0c-6a56-4423-9e79-2ac4519fd035","Type":"ContainerDied","Data":"7efc203e8ed9360678578a6fffc5d52b1143fa805e04be93268cadf223e2b727"} Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.609146 4717 scope.go:117] "RemoveContainer" containerID="5e0ef0b6124f27113098a061522c619e4bd816ec02fd17aa0cc2f5e4c2d2ac85" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.609307 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.639951 4717 scope.go:117] "RemoveContainer" containerID="0bd99ac19688b8300c10fc113636fbbb990bf61fd1829acf52a023ec8ca71f2a" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.662151 4717 scope.go:117] "RemoveContainer" containerID="5e0ef0b6124f27113098a061522c619e4bd816ec02fd17aa0cc2f5e4c2d2ac85" Feb 21 22:05:26 crc kubenswrapper[4717]: E0221 22:05:26.662584 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e0ef0b6124f27113098a061522c619e4bd816ec02fd17aa0cc2f5e4c2d2ac85\": container with ID starting with 5e0ef0b6124f27113098a061522c619e4bd816ec02fd17aa0cc2f5e4c2d2ac85 not found: ID does not exist" containerID="5e0ef0b6124f27113098a061522c619e4bd816ec02fd17aa0cc2f5e4c2d2ac85" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.662631 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e0ef0b6124f27113098a061522c619e4bd816ec02fd17aa0cc2f5e4c2d2ac85"} err="failed to get container status \"5e0ef0b6124f27113098a061522c619e4bd816ec02fd17aa0cc2f5e4c2d2ac85\": rpc error: code = NotFound desc = could not find container \"5e0ef0b6124f27113098a061522c619e4bd816ec02fd17aa0cc2f5e4c2d2ac85\": container with ID starting with 5e0ef0b6124f27113098a061522c619e4bd816ec02fd17aa0cc2f5e4c2d2ac85 not found: ID does not exist" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.662652 4717 scope.go:117] "RemoveContainer" containerID="0bd99ac19688b8300c10fc113636fbbb990bf61fd1829acf52a023ec8ca71f2a" Feb 21 22:05:26 crc kubenswrapper[4717]: E0221 22:05:26.663010 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bd99ac19688b8300c10fc113636fbbb990bf61fd1829acf52a023ec8ca71f2a\": container with ID starting with 
0bd99ac19688b8300c10fc113636fbbb990bf61fd1829acf52a023ec8ca71f2a not found: ID does not exist" containerID="0bd99ac19688b8300c10fc113636fbbb990bf61fd1829acf52a023ec8ca71f2a" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.663087 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bd99ac19688b8300c10fc113636fbbb990bf61fd1829acf52a023ec8ca71f2a"} err="failed to get container status \"0bd99ac19688b8300c10fc113636fbbb990bf61fd1829acf52a023ec8ca71f2a\": rpc error: code = NotFound desc = could not find container \"0bd99ac19688b8300c10fc113636fbbb990bf61fd1829acf52a023ec8ca71f2a\": container with ID starting with 0bd99ac19688b8300c10fc113636fbbb990bf61fd1829acf52a023ec8ca71f2a not found: ID does not exist" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.696327 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4rsw\" (UniqueName: \"kubernetes.io/projected/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-kube-api-access-z4rsw\") pod \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.696360 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-combined-ca-bundle\") pod \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.696415 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-config-data\") pod \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.696460 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-logs\") pod \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\" (UID: \"9b0f7c0c-6a56-4423-9e79-2ac4519fd035\") " Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.697104 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-logs" (OuterVolumeSpecName: "logs") pod "9b0f7c0c-6a56-4423-9e79-2ac4519fd035" (UID: "9b0f7c0c-6a56-4423-9e79-2ac4519fd035"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.703188 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-kube-api-access-z4rsw" (OuterVolumeSpecName: "kube-api-access-z4rsw") pod "9b0f7c0c-6a56-4423-9e79-2ac4519fd035" (UID: "9b0f7c0c-6a56-4423-9e79-2ac4519fd035"). InnerVolumeSpecName "kube-api-access-z4rsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.724817 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-config-data" (OuterVolumeSpecName: "config-data") pod "9b0f7c0c-6a56-4423-9e79-2ac4519fd035" (UID: "9b0f7c0c-6a56-4423-9e79-2ac4519fd035"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.733058 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b0f7c0c-6a56-4423-9e79-2ac4519fd035" (UID: "9b0f7c0c-6a56-4423-9e79-2ac4519fd035"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.798431 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-logs\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.798479 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4rsw\" (UniqueName: \"kubernetes.io/projected/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-kube-api-access-z4rsw\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.798495 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.798510 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b0f7c0c-6a56-4423-9e79-2ac4519fd035-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.941564 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.980137 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.994764 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 21 22:05:26 crc kubenswrapper[4717]: E0221 22:05:26.995543 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0f7c0c-6a56-4423-9e79-2ac4519fd035" containerName="nova-api-log" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.995564 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0f7c0c-6a56-4423-9e79-2ac4519fd035" containerName="nova-api-log" Feb 21 22:05:26 crc kubenswrapper[4717]: E0221 
22:05:26.995589 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0f7c0c-6a56-4423-9e79-2ac4519fd035" containerName="nova-api-api" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.995596 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0f7c0c-6a56-4423-9e79-2ac4519fd035" containerName="nova-api-api" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.996044 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b0f7c0c-6a56-4423-9e79-2ac4519fd035" containerName="nova-api-api" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.996074 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b0f7c0c-6a56-4423-9e79-2ac4519fd035" containerName="nova-api-log" Feb 21 22:05:26 crc kubenswrapper[4717]: I0221 22:05:26.997842 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.003185 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.003353 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.003514 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.008576 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.115334 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-config-data\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.115418 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.115540 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-public-tls-certs\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.115560 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bed012f-4dd5-40fe-abff-9bd0621854c7-logs\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.115578 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgwrl\" (UniqueName: \"kubernetes.io/projected/9bed012f-4dd5-40fe-abff-9bd0621854c7-kube-api-access-lgwrl\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.115654 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.217277 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-public-tls-certs\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.217332 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bed012f-4dd5-40fe-abff-9bd0621854c7-logs\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.217362 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgwrl\" (UniqueName: \"kubernetes.io/projected/9bed012f-4dd5-40fe-abff-9bd0621854c7-kube-api-access-lgwrl\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.217458 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.217505 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-config-data\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.217559 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.218152 
4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bed012f-4dd5-40fe-abff-9bd0621854c7-logs\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.221173 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-public-tls-certs\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.221487 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.225404 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-config-data\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.227558 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.235742 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgwrl\" (UniqueName: \"kubernetes.io/projected/9bed012f-4dd5-40fe-abff-9bd0621854c7-kube-api-access-lgwrl\") pod \"nova-api-0\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " 
pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.324255 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.624848 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08a3f1a6-792f-4679-84f0-55795cb63990","Type":"ContainerStarted","Data":"bba4b30630d57b3d6dba928caa733b984540baeb824ba5d56eb3495440ee5383"} Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.814755 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.916064 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.950223 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:27 crc kubenswrapper[4717]: I0221 22:05:27.987762 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b0f7c0c-6a56-4423-9e79-2ac4519fd035" path="/var/lib/kubelet/pods/9b0f7c0c-6a56-4423-9e79-2ac4519fd035/volumes" Feb 21 22:05:28 crc kubenswrapper[4717]: I0221 22:05:28.640052 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9bed012f-4dd5-40fe-abff-9bd0621854c7","Type":"ContainerStarted","Data":"3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc"} Feb 21 22:05:28 crc kubenswrapper[4717]: I0221 22:05:28.640492 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9bed012f-4dd5-40fe-abff-9bd0621854c7","Type":"ContainerStarted","Data":"d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1"} Feb 21 22:05:28 crc kubenswrapper[4717]: I0221 22:05:28.640510 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"9bed012f-4dd5-40fe-abff-9bd0621854c7","Type":"ContainerStarted","Data":"c911d17b591f0e9d948e149235032de84082388ab98d980ed62655cf72ca5e57"} Feb 21 22:05:28 crc kubenswrapper[4717]: I0221 22:05:28.662013 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.661997001 podStartE2EDuration="2.661997001s" podCreationTimestamp="2026-02-21 22:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:05:28.657921122 +0000 UTC m=+1143.439454744" watchObservedRunningTime="2026-02-21 22:05:28.661997001 +0000 UTC m=+1143.443530623" Feb 21 22:05:28 crc kubenswrapper[4717]: I0221 22:05:28.665073 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 21 22:05:28 crc kubenswrapper[4717]: I0221 22:05:28.823553 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vqz8f"] Feb 21 22:05:28 crc kubenswrapper[4717]: I0221 22:05:28.824775 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:28 crc kubenswrapper[4717]: I0221 22:05:28.826929 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 21 22:05:28 crc kubenswrapper[4717]: I0221 22:05:28.829212 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 21 22:05:28 crc kubenswrapper[4717]: I0221 22:05:28.834560 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vqz8f"] Feb 21 22:05:28 crc kubenswrapper[4717]: I0221 22:05:28.951075 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bpbd\" (UniqueName: \"kubernetes.io/projected/ed51d19d-c6f1-427b-b1de-d7e1debf9870-kube-api-access-6bpbd\") pod \"nova-cell1-cell-mapping-vqz8f\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:28 crc kubenswrapper[4717]: I0221 22:05:28.951479 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-scripts\") pod \"nova-cell1-cell-mapping-vqz8f\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:28 crc kubenswrapper[4717]: I0221 22:05:28.951537 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vqz8f\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:28 crc kubenswrapper[4717]: I0221 22:05:28.951555 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-config-data\") pod \"nova-cell1-cell-mapping-vqz8f\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:29 crc kubenswrapper[4717]: I0221 22:05:29.053282 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-scripts\") pod \"nova-cell1-cell-mapping-vqz8f\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:29 crc kubenswrapper[4717]: I0221 22:05:29.053601 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vqz8f\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:29 crc kubenswrapper[4717]: I0221 22:05:29.053689 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-config-data\") pod \"nova-cell1-cell-mapping-vqz8f\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:29 crc kubenswrapper[4717]: I0221 22:05:29.053832 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bpbd\" (UniqueName: \"kubernetes.io/projected/ed51d19d-c6f1-427b-b1de-d7e1debf9870-kube-api-access-6bpbd\") pod \"nova-cell1-cell-mapping-vqz8f\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:29 crc kubenswrapper[4717]: I0221 22:05:29.058530 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vqz8f\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:29 crc kubenswrapper[4717]: I0221 22:05:29.059083 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-config-data\") pod \"nova-cell1-cell-mapping-vqz8f\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:29 crc kubenswrapper[4717]: I0221 22:05:29.071689 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-scripts\") pod \"nova-cell1-cell-mapping-vqz8f\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:29 crc kubenswrapper[4717]: I0221 22:05:29.082577 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bpbd\" (UniqueName: \"kubernetes.io/projected/ed51d19d-c6f1-427b-b1de-d7e1debf9870-kube-api-access-6bpbd\") pod \"nova-cell1-cell-mapping-vqz8f\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:29 crc kubenswrapper[4717]: I0221 22:05:29.147436 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:29 crc kubenswrapper[4717]: I0221 22:05:29.612221 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vqz8f"] Feb 21 22:05:29 crc kubenswrapper[4717]: I0221 22:05:29.652335 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vqz8f" event={"ID":"ed51d19d-c6f1-427b-b1de-d7e1debf9870","Type":"ContainerStarted","Data":"c84a6ec984664b3266e92f5787e213c7f83d9ad90c159027b8be6c5ed13a8280"} Feb 21 22:05:29 crc kubenswrapper[4717]: I0221 22:05:29.657624 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08a3f1a6-792f-4679-84f0-55795cb63990","Type":"ContainerStarted","Data":"0239ca79db4339d8ecc948e07db9718bd1436afdd8d944e757f88ad5bc0fa596"} Feb 21 22:05:29 crc kubenswrapper[4717]: I0221 22:05:29.657766 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 21 22:05:29 crc kubenswrapper[4717]: I0221 22:05:29.704830 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.768732743 podStartE2EDuration="6.70481116s" podCreationTimestamp="2026-02-21 22:05:23 +0000 UTC" firstStartedPulling="2026-02-21 22:05:24.47054925 +0000 UTC m=+1139.252082902" lastFinishedPulling="2026-02-21 22:05:28.406627697 +0000 UTC m=+1143.188161319" observedRunningTime="2026-02-21 22:05:29.69193679 +0000 UTC m=+1144.473470412" watchObservedRunningTime="2026-02-21 22:05:29.70481116 +0000 UTC m=+1144.486344782" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.071974 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.157534 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-j9l2g"] Feb 21 22:05:30 crc 
kubenswrapper[4717]: I0221 22:05:30.157795 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" podUID="b772fc71-6e74-4887-8278-43d737b82e9e" containerName="dnsmasq-dns" containerID="cri-o://28ef9f785f1ee78a724ecc166a77b5fd7488d03cd2140a5e56a87961a98c89d5" gracePeriod=10 Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.673628 4717 generic.go:334] "Generic (PLEG): container finished" podID="b772fc71-6e74-4887-8278-43d737b82e9e" containerID="28ef9f785f1ee78a724ecc166a77b5fd7488d03cd2140a5e56a87961a98c89d5" exitCode=0 Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.674055 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" event={"ID":"b772fc71-6e74-4887-8278-43d737b82e9e","Type":"ContainerDied","Data":"28ef9f785f1ee78a724ecc166a77b5fd7488d03cd2140a5e56a87961a98c89d5"} Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.674088 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" event={"ID":"b772fc71-6e74-4887-8278-43d737b82e9e","Type":"ContainerDied","Data":"d761bd190202385fc13b631c19c035127fb3ababebb10aadd2af14f1a7a179f1"} Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.674102 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d761bd190202385fc13b631c19c035127fb3ababebb10aadd2af14f1a7a179f1" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.677592 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vqz8f" event={"ID":"ed51d19d-c6f1-427b-b1de-d7e1debf9870","Type":"ContainerStarted","Data":"1f0746c9b963d4516b13d5fade38b15c3fd00ff0246ea8713aa6955edb970b09"} Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.696928 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vqz8f" podStartSLOduration=2.696910379 
podStartE2EDuration="2.696910379s" podCreationTimestamp="2026-02-21 22:05:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:05:30.690554365 +0000 UTC m=+1145.472087988" watchObservedRunningTime="2026-02-21 22:05:30.696910379 +0000 UTC m=+1145.478444021" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.698934 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.791061 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-ovsdbserver-nb\") pod \"b772fc71-6e74-4887-8278-43d737b82e9e\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.791143 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-dns-swift-storage-0\") pod \"b772fc71-6e74-4887-8278-43d737b82e9e\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.791249 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-config\") pod \"b772fc71-6e74-4887-8278-43d737b82e9e\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.791284 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-dns-svc\") pod \"b772fc71-6e74-4887-8278-43d737b82e9e\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 
22:05:30.791310 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dll2l\" (UniqueName: \"kubernetes.io/projected/b772fc71-6e74-4887-8278-43d737b82e9e-kube-api-access-dll2l\") pod \"b772fc71-6e74-4887-8278-43d737b82e9e\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.791339 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-ovsdbserver-sb\") pod \"b772fc71-6e74-4887-8278-43d737b82e9e\" (UID: \"b772fc71-6e74-4887-8278-43d737b82e9e\") " Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.796577 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b772fc71-6e74-4887-8278-43d737b82e9e-kube-api-access-dll2l" (OuterVolumeSpecName: "kube-api-access-dll2l") pod "b772fc71-6e74-4887-8278-43d737b82e9e" (UID: "b772fc71-6e74-4887-8278-43d737b82e9e"). InnerVolumeSpecName "kube-api-access-dll2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.838602 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b772fc71-6e74-4887-8278-43d737b82e9e" (UID: "b772fc71-6e74-4887-8278-43d737b82e9e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.848477 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b772fc71-6e74-4887-8278-43d737b82e9e" (UID: "b772fc71-6e74-4887-8278-43d737b82e9e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.848693 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-config" (OuterVolumeSpecName: "config") pod "b772fc71-6e74-4887-8278-43d737b82e9e" (UID: "b772fc71-6e74-4887-8278-43d737b82e9e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.865138 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b772fc71-6e74-4887-8278-43d737b82e9e" (UID: "b772fc71-6e74-4887-8278-43d737b82e9e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.875494 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b772fc71-6e74-4887-8278-43d737b82e9e" (UID: "b772fc71-6e74-4887-8278-43d737b82e9e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.893428 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.893455 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.893465 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dll2l\" (UniqueName: \"kubernetes.io/projected/b772fc71-6e74-4887-8278-43d737b82e9e-kube-api-access-dll2l\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.893475 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.893483 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:30 crc kubenswrapper[4717]: I0221 22:05:30.893490 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b772fc71-6e74-4887-8278-43d737b82e9e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:31 crc kubenswrapper[4717]: I0221 22:05:31.682887 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-j9l2g" Feb 21 22:05:31 crc kubenswrapper[4717]: I0221 22:05:31.719722 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-j9l2g"] Feb 21 22:05:31 crc kubenswrapper[4717]: I0221 22:05:31.728144 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-j9l2g"] Feb 21 22:05:31 crc kubenswrapper[4717]: I0221 22:05:31.988475 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b772fc71-6e74-4887-8278-43d737b82e9e" path="/var/lib/kubelet/pods/b772fc71-6e74-4887-8278-43d737b82e9e/volumes" Feb 21 22:05:34 crc kubenswrapper[4717]: I0221 22:05:34.739128 4717 generic.go:334] "Generic (PLEG): container finished" podID="ed51d19d-c6f1-427b-b1de-d7e1debf9870" containerID="1f0746c9b963d4516b13d5fade38b15c3fd00ff0246ea8713aa6955edb970b09" exitCode=0 Feb 21 22:05:34 crc kubenswrapper[4717]: I0221 22:05:34.739247 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vqz8f" event={"ID":"ed51d19d-c6f1-427b-b1de-d7e1debf9870","Type":"ContainerDied","Data":"1f0746c9b963d4516b13d5fade38b15c3fd00ff0246ea8713aa6955edb970b09"} Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.225345 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.319645 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-config-data\") pod \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.320253 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-scripts\") pod \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.321133 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bpbd\" (UniqueName: \"kubernetes.io/projected/ed51d19d-c6f1-427b-b1de-d7e1debf9870-kube-api-access-6bpbd\") pod \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.321196 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-combined-ca-bundle\") pod \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\" (UID: \"ed51d19d-c6f1-427b-b1de-d7e1debf9870\") " Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.329281 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed51d19d-c6f1-427b-b1de-d7e1debf9870-kube-api-access-6bpbd" (OuterVolumeSpecName: "kube-api-access-6bpbd") pod "ed51d19d-c6f1-427b-b1de-d7e1debf9870" (UID: "ed51d19d-c6f1-427b-b1de-d7e1debf9870"). InnerVolumeSpecName "kube-api-access-6bpbd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.329424 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-scripts" (OuterVolumeSpecName: "scripts") pod "ed51d19d-c6f1-427b-b1de-d7e1debf9870" (UID: "ed51d19d-c6f1-427b-b1de-d7e1debf9870"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.360165 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-config-data" (OuterVolumeSpecName: "config-data") pod "ed51d19d-c6f1-427b-b1de-d7e1debf9870" (UID: "ed51d19d-c6f1-427b-b1de-d7e1debf9870"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.363435 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed51d19d-c6f1-427b-b1de-d7e1debf9870" (UID: "ed51d19d-c6f1-427b-b1de-d7e1debf9870"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.423749 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.423816 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-scripts\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.423832 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bpbd\" (UniqueName: \"kubernetes.io/projected/ed51d19d-c6f1-427b-b1de-d7e1debf9870-kube-api-access-6bpbd\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.423845 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed51d19d-c6f1-427b-b1de-d7e1debf9870-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.762537 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vqz8f" event={"ID":"ed51d19d-c6f1-427b-b1de-d7e1debf9870","Type":"ContainerDied","Data":"c84a6ec984664b3266e92f5787e213c7f83d9ad90c159027b8be6c5ed13a8280"} Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.763595 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c84a6ec984664b3266e92f5787e213c7f83d9ad90c159027b8be6c5ed13a8280" Feb 21 22:05:36 crc kubenswrapper[4717]: I0221 22:05:36.763235 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vqz8f" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.001253 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.001725 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9bed012f-4dd5-40fe-abff-9bd0621854c7" containerName="nova-api-log" containerID="cri-o://d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1" gracePeriod=30 Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.002433 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9bed012f-4dd5-40fe-abff-9bd0621854c7" containerName="nova-api-api" containerID="cri-o://3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc" gracePeriod=30 Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.041588 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.042172 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c5b032c8-f876-4d16-96f2-0058013b38a3" containerName="nova-scheduler-scheduler" containerID="cri-o://d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1" gracePeriod=30 Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.056276 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.056551 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b088db80-9bd4-4573-9762-45c2be00a3e3" containerName="nova-metadata-log" containerID="cri-o://fef405cc0ca65fff4777fcca547bd4744c91919178be0bd5c0951256ed9cae7a" gracePeriod=30 Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.056773 4717 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b088db80-9bd4-4573-9762-45c2be00a3e3" containerName="nova-metadata-metadata" containerID="cri-o://1e95d672692ff29095340711071b6284f3a4195da4f5b5d5231cae38f2ca92cc" gracePeriod=30 Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.577515 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.645210 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgwrl\" (UniqueName: \"kubernetes.io/projected/9bed012f-4dd5-40fe-abff-9bd0621854c7-kube-api-access-lgwrl\") pod \"9bed012f-4dd5-40fe-abff-9bd0621854c7\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.645532 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-internal-tls-certs\") pod \"9bed012f-4dd5-40fe-abff-9bd0621854c7\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.645599 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-public-tls-certs\") pod \"9bed012f-4dd5-40fe-abff-9bd0621854c7\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.645694 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-config-data\") pod \"9bed012f-4dd5-40fe-abff-9bd0621854c7\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.645728 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-combined-ca-bundle\") pod \"9bed012f-4dd5-40fe-abff-9bd0621854c7\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.645776 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bed012f-4dd5-40fe-abff-9bd0621854c7-logs\") pod \"9bed012f-4dd5-40fe-abff-9bd0621854c7\" (UID: \"9bed012f-4dd5-40fe-abff-9bd0621854c7\") " Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.646276 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bed012f-4dd5-40fe-abff-9bd0621854c7-logs" (OuterVolumeSpecName: "logs") pod "9bed012f-4dd5-40fe-abff-9bd0621854c7" (UID: "9bed012f-4dd5-40fe-abff-9bd0621854c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.652525 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bed012f-4dd5-40fe-abff-9bd0621854c7-kube-api-access-lgwrl" (OuterVolumeSpecName: "kube-api-access-lgwrl") pod "9bed012f-4dd5-40fe-abff-9bd0621854c7" (UID: "9bed012f-4dd5-40fe-abff-9bd0621854c7"). InnerVolumeSpecName "kube-api-access-lgwrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.679491 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bed012f-4dd5-40fe-abff-9bd0621854c7" (UID: "9bed012f-4dd5-40fe-abff-9bd0621854c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.683296 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-config-data" (OuterVolumeSpecName: "config-data") pod "9bed012f-4dd5-40fe-abff-9bd0621854c7" (UID: "9bed012f-4dd5-40fe-abff-9bd0621854c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.705114 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9bed012f-4dd5-40fe-abff-9bd0621854c7" (UID: "9bed012f-4dd5-40fe-abff-9bd0621854c7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.716078 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9bed012f-4dd5-40fe-abff-9bd0621854c7" (UID: "9bed012f-4dd5-40fe-abff-9bd0621854c7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.747137 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.747302 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.747404 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bed012f-4dd5-40fe-abff-9bd0621854c7-logs\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.747462 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgwrl\" (UniqueName: \"kubernetes.io/projected/9bed012f-4dd5-40fe-abff-9bd0621854c7-kube-api-access-lgwrl\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.747515 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.747576 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bed012f-4dd5-40fe-abff-9bd0621854c7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.771563 4717 generic.go:334] "Generic (PLEG): container finished" podID="b088db80-9bd4-4573-9762-45c2be00a3e3" containerID="fef405cc0ca65fff4777fcca547bd4744c91919178be0bd5c0951256ed9cae7a" exitCode=143 Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.771616 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b088db80-9bd4-4573-9762-45c2be00a3e3","Type":"ContainerDied","Data":"fef405cc0ca65fff4777fcca547bd4744c91919178be0bd5c0951256ed9cae7a"} Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.773500 4717 generic.go:334] "Generic (PLEG): container finished" podID="9bed012f-4dd5-40fe-abff-9bd0621854c7" containerID="3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc" exitCode=0 Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.773522 4717 generic.go:334] "Generic (PLEG): container finished" podID="9bed012f-4dd5-40fe-abff-9bd0621854c7" containerID="d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1" exitCode=143 Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.773535 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9bed012f-4dd5-40fe-abff-9bd0621854c7","Type":"ContainerDied","Data":"3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc"} Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.773550 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9bed012f-4dd5-40fe-abff-9bd0621854c7","Type":"ContainerDied","Data":"d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1"} Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.773560 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9bed012f-4dd5-40fe-abff-9bd0621854c7","Type":"ContainerDied","Data":"c911d17b591f0e9d948e149235032de84082388ab98d980ed62655cf72ca5e57"} Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.773574 4717 scope.go:117] "RemoveContainer" containerID="3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.773679 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.813027 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.814684 4717 scope.go:117] "RemoveContainer" containerID="d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.833052 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.845817 4717 scope.go:117] "RemoveContainer" containerID="3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc" Feb 21 22:05:37 crc kubenswrapper[4717]: E0221 22:05:37.846334 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc\": container with ID starting with 3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc not found: ID does not exist" containerID="3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.846450 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc"} err="failed to get container status \"3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc\": rpc error: code = NotFound desc = could not find container \"3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc\": container with ID starting with 3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc not found: ID does not exist" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.846528 4717 scope.go:117] "RemoveContainer" containerID="d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1" Feb 21 22:05:37 crc kubenswrapper[4717]: E0221 
22:05:37.847055 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1\": container with ID starting with d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1 not found: ID does not exist" containerID="d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.847098 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1"} err="failed to get container status \"d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1\": rpc error: code = NotFound desc = could not find container \"d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1\": container with ID starting with d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1 not found: ID does not exist" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.847119 4717 scope.go:117] "RemoveContainer" containerID="3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.848588 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc"} err="failed to get container status \"3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc\": rpc error: code = NotFound desc = could not find container \"3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc\": container with ID starting with 3177cb724520af1c73a4d488396911688c484e13cf1ccedd5748e324549693cc not found: ID does not exist" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.848730 4717 scope.go:117] "RemoveContainer" containerID="d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1" Feb 21 22:05:37 crc 
kubenswrapper[4717]: I0221 22:05:37.849161 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1"} err="failed to get container status \"d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1\": rpc error: code = NotFound desc = could not find container \"d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1\": container with ID starting with d7eb8832be31d36e0e124ba3eb49b42891de30ef92e640c0491b6caa947556b1 not found: ID does not exist" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.868509 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 21 22:05:37 crc kubenswrapper[4717]: E0221 22:05:37.868941 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b772fc71-6e74-4887-8278-43d737b82e9e" containerName="init" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.868960 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b772fc71-6e74-4887-8278-43d737b82e9e" containerName="init" Feb 21 22:05:37 crc kubenswrapper[4717]: E0221 22:05:37.868974 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed51d19d-c6f1-427b-b1de-d7e1debf9870" containerName="nova-manage" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.868980 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed51d19d-c6f1-427b-b1de-d7e1debf9870" containerName="nova-manage" Feb 21 22:05:37 crc kubenswrapper[4717]: E0221 22:05:37.868993 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bed012f-4dd5-40fe-abff-9bd0621854c7" containerName="nova-api-log" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.868999 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bed012f-4dd5-40fe-abff-9bd0621854c7" containerName="nova-api-log" Feb 21 22:05:37 crc kubenswrapper[4717]: E0221 22:05:37.869020 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9bed012f-4dd5-40fe-abff-9bd0621854c7" containerName="nova-api-api" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.869026 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bed012f-4dd5-40fe-abff-9bd0621854c7" containerName="nova-api-api" Feb 21 22:05:37 crc kubenswrapper[4717]: E0221 22:05:37.869033 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b772fc71-6e74-4887-8278-43d737b82e9e" containerName="dnsmasq-dns" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.869041 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b772fc71-6e74-4887-8278-43d737b82e9e" containerName="dnsmasq-dns" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.869195 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b772fc71-6e74-4887-8278-43d737b82e9e" containerName="dnsmasq-dns" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.869214 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed51d19d-c6f1-427b-b1de-d7e1debf9870" containerName="nova-manage" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.869222 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bed012f-4dd5-40fe-abff-9bd0621854c7" containerName="nova-api-log" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.869230 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bed012f-4dd5-40fe-abff-9bd0621854c7" containerName="nova-api-api" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.870317 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.873936 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.874109 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.874123 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.890215 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:05:37 crc kubenswrapper[4717]: E0221 22:05:37.916144 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 21 22:05:37 crc kubenswrapper[4717]: E0221 22:05:37.917459 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 21 22:05:37 crc kubenswrapper[4717]: E0221 22:05:37.919206 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 21 22:05:37 crc kubenswrapper[4717]: E0221 22:05:37.919237 4717 prober.go:104] "Probe errored" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c5b032c8-f876-4d16-96f2-0058013b38a3" containerName="nova-scheduler-scheduler" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.951660 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-public-tls-certs\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.951960 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-config-data\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.952109 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.952214 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.952319 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-logs\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.952424 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzrlm\" (UniqueName: \"kubernetes.io/projected/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-kube-api-access-rzrlm\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:37 crc kubenswrapper[4717]: I0221 22:05:37.986103 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bed012f-4dd5-40fe-abff-9bd0621854c7" path="/var/lib/kubelet/pods/9bed012f-4dd5-40fe-abff-9bd0621854c7/volumes" Feb 21 22:05:38 crc kubenswrapper[4717]: I0221 22:05:38.054278 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-public-tls-certs\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:38 crc kubenswrapper[4717]: I0221 22:05:38.054552 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-config-data\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:38 crc kubenswrapper[4717]: I0221 22:05:38.054820 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:38 crc kubenswrapper[4717]: I0221 22:05:38.055184 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:38 crc kubenswrapper[4717]: I0221 22:05:38.055370 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-logs\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:38 crc kubenswrapper[4717]: I0221 22:05:38.055762 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-logs\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:38 crc kubenswrapper[4717]: I0221 22:05:38.055879 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzrlm\" (UniqueName: \"kubernetes.io/projected/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-kube-api-access-rzrlm\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:38 crc kubenswrapper[4717]: I0221 22:05:38.058640 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:38 crc kubenswrapper[4717]: I0221 22:05:38.058795 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:38 crc kubenswrapper[4717]: I0221 
22:05:38.059159 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-public-tls-certs\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:38 crc kubenswrapper[4717]: I0221 22:05:38.059254 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-config-data\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:38 crc kubenswrapper[4717]: I0221 22:05:38.073287 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzrlm\" (UniqueName: \"kubernetes.io/projected/3b045abf-1d97-45e2-a8ed-ed13aedc19f7-kube-api-access-rzrlm\") pod \"nova-api-0\" (UID: \"3b045abf-1d97-45e2-a8ed-ed13aedc19f7\") " pod="openstack/nova-api-0" Feb 21 22:05:38 crc kubenswrapper[4717]: I0221 22:05:38.203541 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 21 22:05:38 crc kubenswrapper[4717]: I0221 22:05:38.680514 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 21 22:05:38 crc kubenswrapper[4717]: W0221 22:05:38.685591 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b045abf_1d97_45e2_a8ed_ed13aedc19f7.slice/crio-f872b04a7416bb5dba5a1c105c7fe6c0ff4969b00481781f90df35a568ddb0dd WatchSource:0}: Error finding container f872b04a7416bb5dba5a1c105c7fe6c0ff4969b00481781f90df35a568ddb0dd: Status 404 returned error can't find the container with id f872b04a7416bb5dba5a1c105c7fe6c0ff4969b00481781f90df35a568ddb0dd Feb 21 22:05:38 crc kubenswrapper[4717]: I0221 22:05:38.795485 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b045abf-1d97-45e2-a8ed-ed13aedc19f7","Type":"ContainerStarted","Data":"f872b04a7416bb5dba5a1c105c7fe6c0ff4969b00481781f90df35a568ddb0dd"} Feb 21 22:05:39 crc kubenswrapper[4717]: I0221 22:05:39.816801 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b045abf-1d97-45e2-a8ed-ed13aedc19f7","Type":"ContainerStarted","Data":"bb8b0eb97c98a2212a556d28fb02fda2298d7faacc321bffdccb137765b9f2d8"} Feb 21 22:05:39 crc kubenswrapper[4717]: I0221 22:05:39.817236 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b045abf-1d97-45e2-a8ed-ed13aedc19f7","Type":"ContainerStarted","Data":"0ef39c62abe071da2a9156f934a1dd8d1e0484f9724eab764315c89205749ea7"} Feb 21 22:05:39 crc kubenswrapper[4717]: I0221 22:05:39.872578 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.872551664 podStartE2EDuration="2.872551664s" podCreationTimestamp="2026-02-21 22:05:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:05:39.847360338 +0000 UTC m=+1154.628894010" watchObservedRunningTime="2026-02-21 22:05:39.872551664 +0000 UTC m=+1154.654085326" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.199452 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b088db80-9bd4-4573-9762-45c2be00a3e3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:53244->10.217.0.199:8775: read: connection reset by peer" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.199461 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b088db80-9bd4-4573-9762-45c2be00a3e3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:53242->10.217.0.199:8775: read: connection reset by peer" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.729441 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.807937 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9xh6\" (UniqueName: \"kubernetes.io/projected/b088db80-9bd4-4573-9762-45c2be00a3e3-kube-api-access-s9xh6\") pod \"b088db80-9bd4-4573-9762-45c2be00a3e3\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.808001 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-config-data\") pod \"b088db80-9bd4-4573-9762-45c2be00a3e3\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.808255 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b088db80-9bd4-4573-9762-45c2be00a3e3-logs\") pod \"b088db80-9bd4-4573-9762-45c2be00a3e3\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.808318 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-nova-metadata-tls-certs\") pod \"b088db80-9bd4-4573-9762-45c2be00a3e3\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.808343 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-combined-ca-bundle\") pod \"b088db80-9bd4-4573-9762-45c2be00a3e3\" (UID: \"b088db80-9bd4-4573-9762-45c2be00a3e3\") " Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.810725 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b088db80-9bd4-4573-9762-45c2be00a3e3-logs" (OuterVolumeSpecName: "logs") pod "b088db80-9bd4-4573-9762-45c2be00a3e3" (UID: "b088db80-9bd4-4573-9762-45c2be00a3e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.824265 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b088db80-9bd4-4573-9762-45c2be00a3e3-kube-api-access-s9xh6" (OuterVolumeSpecName: "kube-api-access-s9xh6") pod "b088db80-9bd4-4573-9762-45c2be00a3e3" (UID: "b088db80-9bd4-4573-9762-45c2be00a3e3"). InnerVolumeSpecName "kube-api-access-s9xh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.837761 4717 generic.go:334] "Generic (PLEG): container finished" podID="b088db80-9bd4-4573-9762-45c2be00a3e3" containerID="1e95d672692ff29095340711071b6284f3a4195da4f5b5d5231cae38f2ca92cc" exitCode=0 Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.838506 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.838508 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b088db80-9bd4-4573-9762-45c2be00a3e3","Type":"ContainerDied","Data":"1e95d672692ff29095340711071b6284f3a4195da4f5b5d5231cae38f2ca92cc"} Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.838579 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b088db80-9bd4-4573-9762-45c2be00a3e3","Type":"ContainerDied","Data":"935ebcf2b02276c33f68e767b5928bce4b7e849e5b21362ee6e1c7313f0a1340"} Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.838601 4717 scope.go:117] "RemoveContainer" containerID="1e95d672692ff29095340711071b6284f3a4195da4f5b5d5231cae38f2ca92cc" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.850899 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-config-data" (OuterVolumeSpecName: "config-data") pod "b088db80-9bd4-4573-9762-45c2be00a3e3" (UID: "b088db80-9bd4-4573-9762-45c2be00a3e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.854011 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b088db80-9bd4-4573-9762-45c2be00a3e3" (UID: "b088db80-9bd4-4573-9762-45c2be00a3e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.873052 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b088db80-9bd4-4573-9762-45c2be00a3e3" (UID: "b088db80-9bd4-4573-9762-45c2be00a3e3"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.910067 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b088db80-9bd4-4573-9762-45c2be00a3e3-logs\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.910294 4717 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.910377 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.910452 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9xh6\" (UniqueName: \"kubernetes.io/projected/b088db80-9bd4-4573-9762-45c2be00a3e3-kube-api-access-s9xh6\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.910513 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b088db80-9bd4-4573-9762-45c2be00a3e3-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.924742 4717 scope.go:117] "RemoveContainer" 
containerID="fef405cc0ca65fff4777fcca547bd4744c91919178be0bd5c0951256ed9cae7a" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.943105 4717 scope.go:117] "RemoveContainer" containerID="1e95d672692ff29095340711071b6284f3a4195da4f5b5d5231cae38f2ca92cc" Feb 21 22:05:40 crc kubenswrapper[4717]: E0221 22:05:40.943670 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e95d672692ff29095340711071b6284f3a4195da4f5b5d5231cae38f2ca92cc\": container with ID starting with 1e95d672692ff29095340711071b6284f3a4195da4f5b5d5231cae38f2ca92cc not found: ID does not exist" containerID="1e95d672692ff29095340711071b6284f3a4195da4f5b5d5231cae38f2ca92cc" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.943729 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e95d672692ff29095340711071b6284f3a4195da4f5b5d5231cae38f2ca92cc"} err="failed to get container status \"1e95d672692ff29095340711071b6284f3a4195da4f5b5d5231cae38f2ca92cc\": rpc error: code = NotFound desc = could not find container \"1e95d672692ff29095340711071b6284f3a4195da4f5b5d5231cae38f2ca92cc\": container with ID starting with 1e95d672692ff29095340711071b6284f3a4195da4f5b5d5231cae38f2ca92cc not found: ID does not exist" Feb 21 22:05:40 crc kubenswrapper[4717]: I0221 22:05:40.943766 4717 scope.go:117] "RemoveContainer" containerID="fef405cc0ca65fff4777fcca547bd4744c91919178be0bd5c0951256ed9cae7a" Feb 21 22:05:40 crc kubenswrapper[4717]: E0221 22:05:40.944372 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fef405cc0ca65fff4777fcca547bd4744c91919178be0bd5c0951256ed9cae7a\": container with ID starting with fef405cc0ca65fff4777fcca547bd4744c91919178be0bd5c0951256ed9cae7a not found: ID does not exist" containerID="fef405cc0ca65fff4777fcca547bd4744c91919178be0bd5c0951256ed9cae7a" Feb 21 22:05:40 crc 
kubenswrapper[4717]: I0221 22:05:40.944406 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fef405cc0ca65fff4777fcca547bd4744c91919178be0bd5c0951256ed9cae7a"} err="failed to get container status \"fef405cc0ca65fff4777fcca547bd4744c91919178be0bd5c0951256ed9cae7a\": rpc error: code = NotFound desc = could not find container \"fef405cc0ca65fff4777fcca547bd4744c91919178be0bd5c0951256ed9cae7a\": container with ID starting with fef405cc0ca65fff4777fcca547bd4744c91919178be0bd5c0951256ed9cae7a not found: ID does not exist" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.188945 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.199308 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.222093 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 21 22:05:41 crc kubenswrapper[4717]: E0221 22:05:41.222541 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b088db80-9bd4-4573-9762-45c2be00a3e3" containerName="nova-metadata-log" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.222562 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b088db80-9bd4-4573-9762-45c2be00a3e3" containerName="nova-metadata-log" Feb 21 22:05:41 crc kubenswrapper[4717]: E0221 22:05:41.222576 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b088db80-9bd4-4573-9762-45c2be00a3e3" containerName="nova-metadata-metadata" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.222587 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b088db80-9bd4-4573-9762-45c2be00a3e3" containerName="nova-metadata-metadata" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.222822 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b088db80-9bd4-4573-9762-45c2be00a3e3" containerName="nova-metadata-metadata" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.222872 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b088db80-9bd4-4573-9762-45c2be00a3e3" containerName="nova-metadata-log" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.224032 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.227246 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.227747 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.251062 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.320880 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e8a58dd-e035-414e-b133-160ea01477aa-logs\") pod \"nova-metadata-0\" (UID: \"5e8a58dd-e035-414e-b133-160ea01477aa\") " pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.320994 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8a58dd-e035-414e-b133-160ea01477aa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e8a58dd-e035-414e-b133-160ea01477aa\") " pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.321039 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e8a58dd-e035-414e-b133-160ea01477aa-nova-metadata-tls-certs\") 
pod \"nova-metadata-0\" (UID: \"5e8a58dd-e035-414e-b133-160ea01477aa\") " pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.321081 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e8a58dd-e035-414e-b133-160ea01477aa-config-data\") pod \"nova-metadata-0\" (UID: \"5e8a58dd-e035-414e-b133-160ea01477aa\") " pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.321112 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78x5x\" (UniqueName: \"kubernetes.io/projected/5e8a58dd-e035-414e-b133-160ea01477aa-kube-api-access-78x5x\") pod \"nova-metadata-0\" (UID: \"5e8a58dd-e035-414e-b133-160ea01477aa\") " pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.422630 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8a58dd-e035-414e-b133-160ea01477aa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e8a58dd-e035-414e-b133-160ea01477aa\") " pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.422950 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e8a58dd-e035-414e-b133-160ea01477aa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5e8a58dd-e035-414e-b133-160ea01477aa\") " pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.423074 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e8a58dd-e035-414e-b133-160ea01477aa-config-data\") pod \"nova-metadata-0\" (UID: \"5e8a58dd-e035-414e-b133-160ea01477aa\") " pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc 
kubenswrapper[4717]: I0221 22:05:41.423173 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78x5x\" (UniqueName: \"kubernetes.io/projected/5e8a58dd-e035-414e-b133-160ea01477aa-kube-api-access-78x5x\") pod \"nova-metadata-0\" (UID: \"5e8a58dd-e035-414e-b133-160ea01477aa\") " pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.423299 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e8a58dd-e035-414e-b133-160ea01477aa-logs\") pod \"nova-metadata-0\" (UID: \"5e8a58dd-e035-414e-b133-160ea01477aa\") " pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.423534 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e8a58dd-e035-414e-b133-160ea01477aa-logs\") pod \"nova-metadata-0\" (UID: \"5e8a58dd-e035-414e-b133-160ea01477aa\") " pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.426391 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e8a58dd-e035-414e-b133-160ea01477aa-config-data\") pod \"nova-metadata-0\" (UID: \"5e8a58dd-e035-414e-b133-160ea01477aa\") " pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.426878 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e8a58dd-e035-414e-b133-160ea01477aa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5e8a58dd-e035-414e-b133-160ea01477aa\") " pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.426957 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5e8a58dd-e035-414e-b133-160ea01477aa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e8a58dd-e035-414e-b133-160ea01477aa\") " pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.439510 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78x5x\" (UniqueName: \"kubernetes.io/projected/5e8a58dd-e035-414e-b133-160ea01477aa-kube-api-access-78x5x\") pod \"nova-metadata-0\" (UID: \"5e8a58dd-e035-414e-b133-160ea01477aa\") " pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.545367 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.710171 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.728466 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b032c8-f876-4d16-96f2-0058013b38a3-combined-ca-bundle\") pod \"c5b032c8-f876-4d16-96f2-0058013b38a3\" (UID: \"c5b032c8-f876-4d16-96f2-0058013b38a3\") " Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.730103 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fqhg\" (UniqueName: \"kubernetes.io/projected/c5b032c8-f876-4d16-96f2-0058013b38a3-kube-api-access-7fqhg\") pod \"c5b032c8-f876-4d16-96f2-0058013b38a3\" (UID: \"c5b032c8-f876-4d16-96f2-0058013b38a3\") " Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.730222 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b032c8-f876-4d16-96f2-0058013b38a3-config-data\") pod \"c5b032c8-f876-4d16-96f2-0058013b38a3\" (UID: \"c5b032c8-f876-4d16-96f2-0058013b38a3\") " Feb 21 22:05:41 crc 
kubenswrapper[4717]: I0221 22:05:41.736570 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b032c8-f876-4d16-96f2-0058013b38a3-kube-api-access-7fqhg" (OuterVolumeSpecName: "kube-api-access-7fqhg") pod "c5b032c8-f876-4d16-96f2-0058013b38a3" (UID: "c5b032c8-f876-4d16-96f2-0058013b38a3"). InnerVolumeSpecName "kube-api-access-7fqhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.772398 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b032c8-f876-4d16-96f2-0058013b38a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5b032c8-f876-4d16-96f2-0058013b38a3" (UID: "c5b032c8-f876-4d16-96f2-0058013b38a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.776540 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b032c8-f876-4d16-96f2-0058013b38a3-config-data" (OuterVolumeSpecName: "config-data") pod "c5b032c8-f876-4d16-96f2-0058013b38a3" (UID: "c5b032c8-f876-4d16-96f2-0058013b38a3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.832719 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b032c8-f876-4d16-96f2-0058013b38a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.832764 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fqhg\" (UniqueName: \"kubernetes.io/projected/c5b032c8-f876-4d16-96f2-0058013b38a3-kube-api-access-7fqhg\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.832782 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b032c8-f876-4d16-96f2-0058013b38a3-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.861092 4717 generic.go:334] "Generic (PLEG): container finished" podID="c5b032c8-f876-4d16-96f2-0058013b38a3" containerID="d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1" exitCode=0 Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.861130 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.861127 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c5b032c8-f876-4d16-96f2-0058013b38a3","Type":"ContainerDied","Data":"d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1"} Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.861173 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c5b032c8-f876-4d16-96f2-0058013b38a3","Type":"ContainerDied","Data":"c12f695aa9b9a725a8c4d8e1d54801aa235beae08dfc8e8f72d88749626f735b"} Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.861190 4717 scope.go:117] "RemoveContainer" containerID="d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.886983 4717 scope.go:117] "RemoveContainer" containerID="d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1" Feb 21 22:05:41 crc kubenswrapper[4717]: E0221 22:05:41.887514 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1\": container with ID starting with d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1 not found: ID does not exist" containerID="d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.887542 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1"} err="failed to get container status \"d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1\": rpc error: code = NotFound desc = could not find container \"d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1\": container with ID starting with 
d40354f5be8264d5e07e37a22594ad85285c26813188dfa8132383d9e14743f1 not found: ID does not exist" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.906819 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.915378 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.945195 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 21 22:05:41 crc kubenswrapper[4717]: E0221 22:05:41.946099 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b032c8-f876-4d16-96f2-0058013b38a3" containerName="nova-scheduler-scheduler" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.946200 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b032c8-f876-4d16-96f2-0058013b38a3" containerName="nova-scheduler-scheduler" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.946779 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b032c8-f876-4d16-96f2-0058013b38a3" containerName="nova-scheduler-scheduler" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.947780 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.956215 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.963636 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.988578 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b088db80-9bd4-4573-9762-45c2be00a3e3" path="/var/lib/kubelet/pods/b088db80-9bd4-4573-9762-45c2be00a3e3/volumes" Feb 21 22:05:41 crc kubenswrapper[4717]: I0221 22:05:41.989491 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5b032c8-f876-4d16-96f2-0058013b38a3" path="/var/lib/kubelet/pods/c5b032c8-f876-4d16-96f2-0058013b38a3/volumes" Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.036418 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14550721-34cc-41f8-a5f0-e15e73cf2983-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"14550721-34cc-41f8-a5f0-e15e73cf2983\") " pod="openstack/nova-scheduler-0" Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.036768 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14550721-34cc-41f8-a5f0-e15e73cf2983-config-data\") pod \"nova-scheduler-0\" (UID: \"14550721-34cc-41f8-a5f0-e15e73cf2983\") " pod="openstack/nova-scheduler-0" Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.037621 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccm8p\" (UniqueName: \"kubernetes.io/projected/14550721-34cc-41f8-a5f0-e15e73cf2983-kube-api-access-ccm8p\") pod \"nova-scheduler-0\" (UID: 
\"14550721-34cc-41f8-a5f0-e15e73cf2983\") " pod="openstack/nova-scheduler-0" Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.075050 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.140414 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14550721-34cc-41f8-a5f0-e15e73cf2983-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"14550721-34cc-41f8-a5f0-e15e73cf2983\") " pod="openstack/nova-scheduler-0" Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.140505 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14550721-34cc-41f8-a5f0-e15e73cf2983-config-data\") pod \"nova-scheduler-0\" (UID: \"14550721-34cc-41f8-a5f0-e15e73cf2983\") " pod="openstack/nova-scheduler-0" Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.140553 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccm8p\" (UniqueName: \"kubernetes.io/projected/14550721-34cc-41f8-a5f0-e15e73cf2983-kube-api-access-ccm8p\") pod \"nova-scheduler-0\" (UID: \"14550721-34cc-41f8-a5f0-e15e73cf2983\") " pod="openstack/nova-scheduler-0" Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.147137 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14550721-34cc-41f8-a5f0-e15e73cf2983-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"14550721-34cc-41f8-a5f0-e15e73cf2983\") " pod="openstack/nova-scheduler-0" Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.147402 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14550721-34cc-41f8-a5f0-e15e73cf2983-config-data\") pod \"nova-scheduler-0\" (UID: 
\"14550721-34cc-41f8-a5f0-e15e73cf2983\") " pod="openstack/nova-scheduler-0" Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.157836 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccm8p\" (UniqueName: \"kubernetes.io/projected/14550721-34cc-41f8-a5f0-e15e73cf2983-kube-api-access-ccm8p\") pod \"nova-scheduler-0\" (UID: \"14550721-34cc-41f8-a5f0-e15e73cf2983\") " pod="openstack/nova-scheduler-0" Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.274552 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.869511 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e8a58dd-e035-414e-b133-160ea01477aa","Type":"ContainerStarted","Data":"2128e159e5b78477a83ac14a09430576d57d1fb8afd73e50e9d506a822f9a24d"} Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.869925 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e8a58dd-e035-414e-b133-160ea01477aa","Type":"ContainerStarted","Data":"80aece25fb0f6a11dbdd50ff826d8503588a30487456b86ce9a7c6ae10b27e6b"} Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.869943 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e8a58dd-e035-414e-b133-160ea01477aa","Type":"ContainerStarted","Data":"0f959730a1b25f0acd25218abb89a1f2f25a5b5aadc90c90db16da15fdd44f91"} Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.890988 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.890961494 podStartE2EDuration="1.890961494s" podCreationTimestamp="2026-02-21 22:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:05:42.883202526 +0000 UTC m=+1157.664736148" 
watchObservedRunningTime="2026-02-21 22:05:42.890961494 +0000 UTC m=+1157.672495136" Feb 21 22:05:42 crc kubenswrapper[4717]: I0221 22:05:42.965575 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 21 22:05:43 crc kubenswrapper[4717]: I0221 22:05:43.885344 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"14550721-34cc-41f8-a5f0-e15e73cf2983","Type":"ContainerStarted","Data":"5f89db01915eb0110384061f8025b2b5e024704aeb171751b9215f5833cc02cc"} Feb 21 22:05:43 crc kubenswrapper[4717]: I0221 22:05:43.885965 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"14550721-34cc-41f8-a5f0-e15e73cf2983","Type":"ContainerStarted","Data":"59ee89596ff40662fb6d4484cd2993f9e59e7bcc797064d74ef13a86c914fe11"} Feb 21 22:05:43 crc kubenswrapper[4717]: I0221 22:05:43.903796 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.9037754099999997 podStartE2EDuration="2.90377541s" podCreationTimestamp="2026-02-21 22:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:05:43.901534827 +0000 UTC m=+1158.683068439" watchObservedRunningTime="2026-02-21 22:05:43.90377541 +0000 UTC m=+1158.685309032" Feb 21 22:05:46 crc kubenswrapper[4717]: I0221 22:05:46.546034 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 21 22:05:46 crc kubenswrapper[4717]: I0221 22:05:46.546418 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 21 22:05:47 crc kubenswrapper[4717]: I0221 22:05:47.275734 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 21 22:05:48 crc kubenswrapper[4717]: I0221 22:05:48.204368 4717 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 21 22:05:48 crc kubenswrapper[4717]: I0221 22:05:48.204768 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 21 22:05:49 crc kubenswrapper[4717]: I0221 22:05:49.219089 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3b045abf-1d97-45e2-a8ed-ed13aedc19f7" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 21 22:05:49 crc kubenswrapper[4717]: I0221 22:05:49.219150 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3b045abf-1d97-45e2-a8ed-ed13aedc19f7" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 21 22:05:51 crc kubenswrapper[4717]: I0221 22:05:51.546568 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 21 22:05:51 crc kubenswrapper[4717]: I0221 22:05:51.548746 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 21 22:05:52 crc kubenswrapper[4717]: I0221 22:05:52.275976 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 21 22:05:52 crc kubenswrapper[4717]: I0221 22:05:52.310492 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 21 22:05:52 crc kubenswrapper[4717]: I0221 22:05:52.547024 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5e8a58dd-e035-414e-b133-160ea01477aa" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded 
while awaiting headers)" Feb 21 22:05:52 crc kubenswrapper[4717]: I0221 22:05:52.559122 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5e8a58dd-e035-414e-b133-160ea01477aa" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 21 22:05:53 crc kubenswrapper[4717]: I0221 22:05:53.058166 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 21 22:05:53 crc kubenswrapper[4717]: I0221 22:05:53.973360 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 21 22:05:58 crc kubenswrapper[4717]: I0221 22:05:58.213327 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 21 22:05:58 crc kubenswrapper[4717]: I0221 22:05:58.214503 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 21 22:05:58 crc kubenswrapper[4717]: I0221 22:05:58.215051 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 21 22:05:58 crc kubenswrapper[4717]: I0221 22:05:58.228457 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 21 22:05:59 crc kubenswrapper[4717]: I0221 22:05:59.088471 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 21 22:05:59 crc kubenswrapper[4717]: I0221 22:05:59.098612 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 21 22:06:01 crc kubenswrapper[4717]: I0221 22:06:01.554020 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 21 22:06:01 crc kubenswrapper[4717]: I0221 22:06:01.554387 4717 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 21 22:06:01 crc kubenswrapper[4717]: I0221 22:06:01.566395 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 21 22:06:01 crc kubenswrapper[4717]: I0221 22:06:01.567909 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 21 22:06:09 crc kubenswrapper[4717]: I0221 22:06:09.377853 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 21 22:06:10 crc kubenswrapper[4717]: I0221 22:06:10.550997 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 21 22:06:13 crc kubenswrapper[4717]: I0221 22:06:13.517541 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" containerName="rabbitmq" containerID="cri-o://59fbc8b4fbe404e5c2339d9ed4d1c06437183c12e4a79f8b84f33d59b3f68ce5" gracePeriod=604796 Feb 21 22:06:14 crc kubenswrapper[4717]: I0221 22:06:14.526411 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="2400f71f-f7db-4ed8-83aa-8427afd4dcd5" containerName="rabbitmq" containerID="cri-o://25df727068804aa0ec572e9e107f336bdd2b5d7c0bcfdc22ac8c108d4ba5c34b" gracePeriod=604797 Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.119249 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.148497 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-erlang-cookie-secret\") pod \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.148566 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-config-data\") pod \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.148685 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-plugins-conf\") pod \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.148717 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-server-conf\") pod \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.148739 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-erlang-cookie\") pod \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.148788 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-pod-info\") pod \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.148814 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-tls\") pod \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.148830 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.148881 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-plugins\") pod \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.148922 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-confd\") pod \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.148943 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5nkt\" (UniqueName: \"kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-kube-api-access-w5nkt\") pod \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\" (UID: \"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd\") " Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 
22:06:20.150385 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" (UID: "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.150929 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" (UID: "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.151363 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" (UID: "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.158076 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" (UID: "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.159537 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-kube-api-access-w5nkt" (OuterVolumeSpecName: "kube-api-access-w5nkt") pod "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" (UID: "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd"). InnerVolumeSpecName "kube-api-access-w5nkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.159809 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-pod-info" (OuterVolumeSpecName: "pod-info") pod "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" (UID: "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.167630 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" (UID: "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.192973 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" (UID: "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.223811 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-config-data" (OuterVolumeSpecName: "config-data") pod "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" (UID: "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.251108 4717 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-pod-info\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.251144 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.251181 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.251193 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.251206 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5nkt\" (UniqueName: \"kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-kube-api-access-w5nkt\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.251217 4717 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.251228 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.251238 4717 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.251249 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.303718 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.312703 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-server-conf" (OuterVolumeSpecName: "server-conf") pod "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" (UID: "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.345891 4717 generic.go:334] "Generic (PLEG): container finished" podID="c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" containerID="59fbc8b4fbe404e5c2339d9ed4d1c06437183c12e4a79f8b84f33d59b3f68ce5" exitCode=0 Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.345941 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd","Type":"ContainerDied","Data":"59fbc8b4fbe404e5c2339d9ed4d1c06437183c12e4a79f8b84f33d59b3f68ce5"} Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.345980 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c030d9bf-a8c2-4dc0-996b-82ed1214d4bd","Type":"ContainerDied","Data":"6a59bb117181333263825fa147402842572afce74fbe6a63942845cd545015c8"} Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.345999 4717 scope.go:117] "RemoveContainer" containerID="59fbc8b4fbe404e5c2339d9ed4d1c06437183c12e4a79f8b84f33d59b3f68ce5" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.346167 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.353220 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-blkrj"] Feb 21 22:06:20 crc kubenswrapper[4717]: E0221 22:06:20.353575 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" containerName="setup-container" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.353586 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" containerName="setup-container" Feb 21 22:06:20 crc kubenswrapper[4717]: E0221 22:06:20.353620 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" containerName="rabbitmq" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.353626 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" containerName="rabbitmq" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.353783 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" containerName="rabbitmq" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.355603 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.355615 4717 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-server-conf\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.356278 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.363625 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" (UID: "c030d9bf-a8c2-4dc0-996b-82ed1214d4bd"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.363811 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.364345 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-blkrj"] Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.424069 4717 scope.go:117] "RemoveContainer" containerID="077026fc3358addb5baa8f087ccbe44cd9c25bb57ac6177774c2be4302445486" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.459254 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.459311 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfs85\" (UniqueName: \"kubernetes.io/projected/f002385f-3e66-4ced-a9e6-b2f056fb9053-kube-api-access-tfs85\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.459343 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.459391 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.459694 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-config\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.459825 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-dns-svc\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.459978 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.460136 4717 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.465826 4717 scope.go:117] "RemoveContainer" containerID="59fbc8b4fbe404e5c2339d9ed4d1c06437183c12e4a79f8b84f33d59b3f68ce5" Feb 21 22:06:20 crc kubenswrapper[4717]: E0221 22:06:20.466134 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59fbc8b4fbe404e5c2339d9ed4d1c06437183c12e4a79f8b84f33d59b3f68ce5\": container with ID starting with 59fbc8b4fbe404e5c2339d9ed4d1c06437183c12e4a79f8b84f33d59b3f68ce5 not found: ID does not exist" containerID="59fbc8b4fbe404e5c2339d9ed4d1c06437183c12e4a79f8b84f33d59b3f68ce5" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.466157 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59fbc8b4fbe404e5c2339d9ed4d1c06437183c12e4a79f8b84f33d59b3f68ce5"} err="failed to get container status \"59fbc8b4fbe404e5c2339d9ed4d1c06437183c12e4a79f8b84f33d59b3f68ce5\": rpc error: code = NotFound desc = could not find container \"59fbc8b4fbe404e5c2339d9ed4d1c06437183c12e4a79f8b84f33d59b3f68ce5\": container with ID starting with 59fbc8b4fbe404e5c2339d9ed4d1c06437183c12e4a79f8b84f33d59b3f68ce5 not found: ID does not exist" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.466175 4717 scope.go:117] "RemoveContainer" containerID="077026fc3358addb5baa8f087ccbe44cd9c25bb57ac6177774c2be4302445486" Feb 21 22:06:20 crc kubenswrapper[4717]: E0221 22:06:20.468356 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"077026fc3358addb5baa8f087ccbe44cd9c25bb57ac6177774c2be4302445486\": container with ID starting with 077026fc3358addb5baa8f087ccbe44cd9c25bb57ac6177774c2be4302445486 not found: ID does not 
exist" containerID="077026fc3358addb5baa8f087ccbe44cd9c25bb57ac6177774c2be4302445486" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.468378 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"077026fc3358addb5baa8f087ccbe44cd9c25bb57ac6177774c2be4302445486"} err="failed to get container status \"077026fc3358addb5baa8f087ccbe44cd9c25bb57ac6177774c2be4302445486\": rpc error: code = NotFound desc = could not find container \"077026fc3358addb5baa8f087ccbe44cd9c25bb57ac6177774c2be4302445486\": container with ID starting with 077026fc3358addb5baa8f087ccbe44cd9c25bb57ac6177774c2be4302445486 not found: ID does not exist" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.562104 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.562458 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfs85\" (UniqueName: \"kubernetes.io/projected/f002385f-3e66-4ced-a9e6-b2f056fb9053-kube-api-access-tfs85\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.562495 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.562547 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.562661 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-config\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.562722 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-dns-svc\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.562793 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.563597 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.564906 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.565103 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-config\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.565567 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.565571 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.566027 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-dns-svc\") pod \"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.578450 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfs85\" (UniqueName: \"kubernetes.io/projected/f002385f-3e66-4ced-a9e6-b2f056fb9053-kube-api-access-tfs85\") pod 
\"dnsmasq-dns-67b789f86c-blkrj\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.698811 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.710226 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.730509 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.746628 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.748522 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.761250 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.761501 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.761643 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dldpr" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.761783 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.790628 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.811176 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.811562 
4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.811760 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.871190 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7752p\" (UniqueName: \"kubernetes.io/projected/38ce8ac2-d776-449b-89d8-3e9a853a8f44-kube-api-access-7752p\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.871271 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38ce8ac2-d776-449b-89d8-3e9a853a8f44-server-conf\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.871308 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38ce8ac2-d776-449b-89d8-3e9a853a8f44-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.871327 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38ce8ac2-d776-449b-89d8-3e9a853a8f44-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.871382 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38ce8ac2-d776-449b-89d8-3e9a853a8f44-pod-info\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.871406 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38ce8ac2-d776-449b-89d8-3e9a853a8f44-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.871432 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.871467 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38ce8ac2-d776-449b-89d8-3e9a853a8f44-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.871484 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38ce8ac2-d776-449b-89d8-3e9a853a8f44-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.871509 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/38ce8ac2-d776-449b-89d8-3e9a853a8f44-config-data\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.871537 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38ce8ac2-d776-449b-89d8-3e9a853a8f44-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.972870 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38ce8ac2-d776-449b-89d8-3e9a853a8f44-pod-info\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.972932 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38ce8ac2-d776-449b-89d8-3e9a853a8f44-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.972962 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.972987 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38ce8ac2-d776-449b-89d8-3e9a853a8f44-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " 
pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.973007 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38ce8ac2-d776-449b-89d8-3e9a853a8f44-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.973025 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38ce8ac2-d776-449b-89d8-3e9a853a8f44-config-data\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.973052 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38ce8ac2-d776-449b-89d8-3e9a853a8f44-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.973068 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7752p\" (UniqueName: \"kubernetes.io/projected/38ce8ac2-d776-449b-89d8-3e9a853a8f44-kube-api-access-7752p\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.973108 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38ce8ac2-d776-449b-89d8-3e9a853a8f44-server-conf\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.973138 4717 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38ce8ac2-d776-449b-89d8-3e9a853a8f44-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.973156 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38ce8ac2-d776-449b-89d8-3e9a853a8f44-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.974120 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38ce8ac2-d776-449b-89d8-3e9a853a8f44-config-data\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.974271 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.974833 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38ce8ac2-d776-449b-89d8-3e9a853a8f44-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.975125 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38ce8ac2-d776-449b-89d8-3e9a853a8f44-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" 
(UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.975328 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38ce8ac2-d776-449b-89d8-3e9a853a8f44-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.976085 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38ce8ac2-d776-449b-89d8-3e9a853a8f44-server-conf\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.976492 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38ce8ac2-d776-449b-89d8-3e9a853a8f44-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.976892 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38ce8ac2-d776-449b-89d8-3e9a853a8f44-pod-info\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.985425 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38ce8ac2-d776-449b-89d8-3e9a853a8f44-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.985547 4717 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38ce8ac2-d776-449b-89d8-3e9a853a8f44-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:20 crc kubenswrapper[4717]: I0221 22:06:20.996206 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7752p\" (UniqueName: \"kubernetes.io/projected/38ce8ac2-d776-449b-89d8-3e9a853a8f44-kube-api-access-7752p\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.019226 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"38ce8ac2-d776-449b-89d8-3e9a853a8f44\") " pod="openstack/rabbitmq-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.158076 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.236662 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.360136 4717 generic.go:334] "Generic (PLEG): container finished" podID="2400f71f-f7db-4ed8-83aa-8427afd4dcd5" containerID="25df727068804aa0ec572e9e107f336bdd2b5d7c0bcfdc22ac8c108d4ba5c34b" exitCode=0 Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.360319 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2400f71f-f7db-4ed8-83aa-8427afd4dcd5","Type":"ContainerDied","Data":"25df727068804aa0ec572e9e107f336bdd2b5d7c0bcfdc22ac8c108d4ba5c34b"} Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.360391 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.360467 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2400f71f-f7db-4ed8-83aa-8427afd4dcd5","Type":"ContainerDied","Data":"7a499f94393ed90369dc531875aa72cbf0408202b6091dde870184a9ffdb964b"} Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.360513 4717 scope.go:117] "RemoveContainer" containerID="25df727068804aa0ec572e9e107f336bdd2b5d7c0bcfdc22ac8c108d4ba5c34b" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.390004 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-plugins-conf\") pod \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.390053 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-server-conf\") pod \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.390095 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-plugins\") pod \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.390132 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njnjr\" (UniqueName: \"kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-kube-api-access-njnjr\") pod \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " Feb 21 22:06:21 crc 
kubenswrapper[4717]: I0221 22:06:21.390163 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-confd\") pod \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.390187 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-config-data\") pod \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.390252 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-pod-info\") pod \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.390276 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-tls\") pod \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.390306 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-erlang-cookie\") pod \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.390332 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-erlang-cookie-secret\") pod \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.390386 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\" (UID: \"2400f71f-f7db-4ed8-83aa-8427afd4dcd5\") " Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.391679 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2400f71f-f7db-4ed8-83aa-8427afd4dcd5" (UID: "2400f71f-f7db-4ed8-83aa-8427afd4dcd5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.392336 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2400f71f-f7db-4ed8-83aa-8427afd4dcd5" (UID: "2400f71f-f7db-4ed8-83aa-8427afd4dcd5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.392715 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2400f71f-f7db-4ed8-83aa-8427afd4dcd5" (UID: "2400f71f-f7db-4ed8-83aa-8427afd4dcd5"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.401869 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "2400f71f-f7db-4ed8-83aa-8427afd4dcd5" (UID: "2400f71f-f7db-4ed8-83aa-8427afd4dcd5"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.401927 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2400f71f-f7db-4ed8-83aa-8427afd4dcd5" (UID: "2400f71f-f7db-4ed8-83aa-8427afd4dcd5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.411030 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2400f71f-f7db-4ed8-83aa-8427afd4dcd5" (UID: "2400f71f-f7db-4ed8-83aa-8427afd4dcd5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.411076 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-kube-api-access-njnjr" (OuterVolumeSpecName: "kube-api-access-njnjr") pod "2400f71f-f7db-4ed8-83aa-8427afd4dcd5" (UID: "2400f71f-f7db-4ed8-83aa-8427afd4dcd5"). InnerVolumeSpecName "kube-api-access-njnjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.416491 4717 scope.go:117] "RemoveContainer" containerID="eb8bafce1f801194c335304b8c1230af7ed56a9b5b62a58262e456a8cd3064b1" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.417594 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-pod-info" (OuterVolumeSpecName: "pod-info") pod "2400f71f-f7db-4ed8-83aa-8427afd4dcd5" (UID: "2400f71f-f7db-4ed8-83aa-8427afd4dcd5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.440349 4717 scope.go:117] "RemoveContainer" containerID="25df727068804aa0ec572e9e107f336bdd2b5d7c0bcfdc22ac8c108d4ba5c34b" Feb 21 22:06:21 crc kubenswrapper[4717]: E0221 22:06:21.441317 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25df727068804aa0ec572e9e107f336bdd2b5d7c0bcfdc22ac8c108d4ba5c34b\": container with ID starting with 25df727068804aa0ec572e9e107f336bdd2b5d7c0bcfdc22ac8c108d4ba5c34b not found: ID does not exist" containerID="25df727068804aa0ec572e9e107f336bdd2b5d7c0bcfdc22ac8c108d4ba5c34b" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.441360 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25df727068804aa0ec572e9e107f336bdd2b5d7c0bcfdc22ac8c108d4ba5c34b"} err="failed to get container status \"25df727068804aa0ec572e9e107f336bdd2b5d7c0bcfdc22ac8c108d4ba5c34b\": rpc error: code = NotFound desc = could not find container \"25df727068804aa0ec572e9e107f336bdd2b5d7c0bcfdc22ac8c108d4ba5c34b\": container with ID starting with 25df727068804aa0ec572e9e107f336bdd2b5d7c0bcfdc22ac8c108d4ba5c34b not found: ID does not exist" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.441393 4717 scope.go:117] 
"RemoveContainer" containerID="eb8bafce1f801194c335304b8c1230af7ed56a9b5b62a58262e456a8cd3064b1" Feb 21 22:06:21 crc kubenswrapper[4717]: E0221 22:06:21.442188 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb8bafce1f801194c335304b8c1230af7ed56a9b5b62a58262e456a8cd3064b1\": container with ID starting with eb8bafce1f801194c335304b8c1230af7ed56a9b5b62a58262e456a8cd3064b1 not found: ID does not exist" containerID="eb8bafce1f801194c335304b8c1230af7ed56a9b5b62a58262e456a8cd3064b1" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.442214 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb8bafce1f801194c335304b8c1230af7ed56a9b5b62a58262e456a8cd3064b1"} err="failed to get container status \"eb8bafce1f801194c335304b8c1230af7ed56a9b5b62a58262e456a8cd3064b1\": rpc error: code = NotFound desc = could not find container \"eb8bafce1f801194c335304b8c1230af7ed56a9b5b62a58262e456a8cd3064b1\": container with ID starting with eb8bafce1f801194c335304b8c1230af7ed56a9b5b62a58262e456a8cd3064b1 not found: ID does not exist" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.447446 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-blkrj"] Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.467636 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-config-data" (OuterVolumeSpecName: "config-data") pod "2400f71f-f7db-4ed8-83aa-8427afd4dcd5" (UID: "2400f71f-f7db-4ed8-83aa-8427afd4dcd5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.492218 4717 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-pod-info\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.492385 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.492439 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.492487 4717 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.492584 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.492637 4717 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.492684 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.492730 4717 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njnjr\" (UniqueName: \"kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-kube-api-access-njnjr\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.492776 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.494969 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-server-conf" (OuterVolumeSpecName: "server-conf") pod "2400f71f-f7db-4ed8-83aa-8427afd4dcd5" (UID: "2400f71f-f7db-4ed8-83aa-8427afd4dcd5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.523475 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.570411 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2400f71f-f7db-4ed8-83aa-8427afd4dcd5" (UID: "2400f71f-f7db-4ed8-83aa-8427afd4dcd5"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.594382 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.594440 4717 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-server-conf\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.594454 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2400f71f-f7db-4ed8-83aa-8427afd4dcd5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:21 crc kubenswrapper[4717]: W0221 22:06:21.680320 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38ce8ac2_d776_449b_89d8_3e9a853a8f44.slice/crio-94d2b1a8168063598ad450e7118db5deb632b50ef043d6b5f29fa7a2587989ca WatchSource:0}: Error finding container 94d2b1a8168063598ad450e7118db5deb632b50ef043d6b5f29fa7a2587989ca: Status 404 returned error can't find the container with id 94d2b1a8168063598ad450e7118db5deb632b50ef043d6b5f29fa7a2587989ca Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.694734 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.704044 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.715536 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.725085 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 
21 22:06:21 crc kubenswrapper[4717]: E0221 22:06:21.725543 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2400f71f-f7db-4ed8-83aa-8427afd4dcd5" containerName="setup-container" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.725560 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2400f71f-f7db-4ed8-83aa-8427afd4dcd5" containerName="setup-container" Feb 21 22:06:21 crc kubenswrapper[4717]: E0221 22:06:21.725584 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2400f71f-f7db-4ed8-83aa-8427afd4dcd5" containerName="rabbitmq" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.725591 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2400f71f-f7db-4ed8-83aa-8427afd4dcd5" containerName="rabbitmq" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.725777 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2400f71f-f7db-4ed8-83aa-8427afd4dcd5" containerName="rabbitmq" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.726799 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.732567 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.732667 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.732715 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-swqhk" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.732848 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.732993 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.733085 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.738102 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.756911 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.898912 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5547796-04e1-40e3-aa4a-a1aa936efcda-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.898975 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5547796-04e1-40e3-aa4a-a1aa936efcda-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.899011 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5547796-04e1-40e3-aa4a-a1aa936efcda-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.899073 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5547796-04e1-40e3-aa4a-a1aa936efcda-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.899156 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.899457 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnvzb\" (UniqueName: \"kubernetes.io/projected/b5547796-04e1-40e3-aa4a-a1aa936efcda-kube-api-access-wnvzb\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.899548 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5547796-04e1-40e3-aa4a-a1aa936efcda-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.899597 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5547796-04e1-40e3-aa4a-a1aa936efcda-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.899676 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5547796-04e1-40e3-aa4a-a1aa936efcda-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.899745 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5547796-04e1-40e3-aa4a-a1aa936efcda-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.899775 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5547796-04e1-40e3-aa4a-a1aa936efcda-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.991477 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2400f71f-f7db-4ed8-83aa-8427afd4dcd5" 
path="/var/lib/kubelet/pods/2400f71f-f7db-4ed8-83aa-8427afd4dcd5/volumes" Feb 21 22:06:21 crc kubenswrapper[4717]: I0221 22:06:21.992462 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c030d9bf-a8c2-4dc0-996b-82ed1214d4bd" path="/var/lib/kubelet/pods/c030d9bf-a8c2-4dc0-996b-82ed1214d4bd/volumes" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.001112 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5547796-04e1-40e3-aa4a-a1aa936efcda-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.001167 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5547796-04e1-40e3-aa4a-a1aa936efcda-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.001188 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5547796-04e1-40e3-aa4a-a1aa936efcda-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.001262 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5547796-04e1-40e3-aa4a-a1aa936efcda-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.001282 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5547796-04e1-40e3-aa4a-a1aa936efcda-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.001303 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5547796-04e1-40e3-aa4a-a1aa936efcda-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.001320 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5547796-04e1-40e3-aa4a-a1aa936efcda-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.001341 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.001367 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnvzb\" (UniqueName: \"kubernetes.io/projected/b5547796-04e1-40e3-aa4a-a1aa936efcda-kube-api-access-wnvzb\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.001395 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5547796-04e1-40e3-aa4a-a1aa936efcda-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.001415 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5547796-04e1-40e3-aa4a-a1aa936efcda-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.002258 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5547796-04e1-40e3-aa4a-a1aa936efcda-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.002688 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5547796-04e1-40e3-aa4a-a1aa936efcda-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.003378 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.004130 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5547796-04e1-40e3-aa4a-a1aa936efcda-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.004537 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5547796-04e1-40e3-aa4a-a1aa936efcda-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.005234 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5547796-04e1-40e3-aa4a-a1aa936efcda-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.007822 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5547796-04e1-40e3-aa4a-a1aa936efcda-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.008924 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5547796-04e1-40e3-aa4a-a1aa936efcda-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.009241 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5547796-04e1-40e3-aa4a-a1aa936efcda-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.011718 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5547796-04e1-40e3-aa4a-a1aa936efcda-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.020893 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnvzb\" (UniqueName: \"kubernetes.io/projected/b5547796-04e1-40e3-aa4a-a1aa936efcda-kube-api-access-wnvzb\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.036639 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5547796-04e1-40e3-aa4a-a1aa936efcda\") " pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.148060 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.384411 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38ce8ac2-d776-449b-89d8-3e9a853a8f44","Type":"ContainerStarted","Data":"94d2b1a8168063598ad450e7118db5deb632b50ef043d6b5f29fa7a2587989ca"} Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.386313 4717 generic.go:334] "Generic (PLEG): container finished" podID="f002385f-3e66-4ced-a9e6-b2f056fb9053" containerID="2edc4acfdf5ac4bab9bf5b5314800a199a484c6d86f2dc3e611e659b68174725" exitCode=0 Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.386365 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-blkrj" event={"ID":"f002385f-3e66-4ced-a9e6-b2f056fb9053","Type":"ContainerDied","Data":"2edc4acfdf5ac4bab9bf5b5314800a199a484c6d86f2dc3e611e659b68174725"} Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.386380 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-blkrj" event={"ID":"f002385f-3e66-4ced-a9e6-b2f056fb9053","Type":"ContainerStarted","Data":"3d8b19125c3222d51914d1ec925ef41a1d8fcc354552c6cdcd4c0f497811a872"} Feb 21 22:06:22 crc kubenswrapper[4717]: I0221 22:06:22.606251 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 21 22:06:22 crc kubenswrapper[4717]: W0221 22:06:22.694077 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5547796_04e1_40e3_aa4a_a1aa936efcda.slice/crio-23955e35bc7709d1530b6ff1d3152ef82989917eba0fd08799eba2153e45626f WatchSource:0}: Error finding container 23955e35bc7709d1530b6ff1d3152ef82989917eba0fd08799eba2153e45626f: Status 404 returned error can't find the container with id 23955e35bc7709d1530b6ff1d3152ef82989917eba0fd08799eba2153e45626f Feb 21 22:06:23 crc kubenswrapper[4717]: I0221 
22:06:23.399281 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-blkrj" event={"ID":"f002385f-3e66-4ced-a9e6-b2f056fb9053","Type":"ContainerStarted","Data":"929e69a9b7bd65b9c9ff24c0b2e02d425f2f28889df151451fb1810b84f76b75"} Feb 21 22:06:23 crc kubenswrapper[4717]: I0221 22:06:23.399628 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:23 crc kubenswrapper[4717]: I0221 22:06:23.402167 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38ce8ac2-d776-449b-89d8-3e9a853a8f44","Type":"ContainerStarted","Data":"5b90a0fe8530aed03f95e8431d2b07f18f16c7b9f843de3db90705cbc0933337"} Feb 21 22:06:23 crc kubenswrapper[4717]: I0221 22:06:23.404144 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b5547796-04e1-40e3-aa4a-a1aa936efcda","Type":"ContainerStarted","Data":"23955e35bc7709d1530b6ff1d3152ef82989917eba0fd08799eba2153e45626f"} Feb 21 22:06:23 crc kubenswrapper[4717]: I0221 22:06:23.430669 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-blkrj" podStartSLOduration=3.430643159 podStartE2EDuration="3.430643159s" podCreationTimestamp="2026-02-21 22:06:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:06:23.422487004 +0000 UTC m=+1198.204020636" watchObservedRunningTime="2026-02-21 22:06:23.430643159 +0000 UTC m=+1198.212176811" Feb 21 22:06:25 crc kubenswrapper[4717]: I0221 22:06:25.426845 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b5547796-04e1-40e3-aa4a-a1aa936efcda","Type":"ContainerStarted","Data":"ad342b7fa122b1db94827e2b955527035a405f8d7cfe682a72f00555fdd181cc"} Feb 21 22:06:30 crc kubenswrapper[4717]: I0221 
22:06:30.733099 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:30 crc kubenswrapper[4717]: I0221 22:06:30.811519 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-82jvl"] Feb 21 22:06:30 crc kubenswrapper[4717]: I0221 22:06:30.812348 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" podUID="29abbd2c-eeb4-49e9-afd3-8947f3b50ba8" containerName="dnsmasq-dns" containerID="cri-o://c3759846fd9fb78a9f4a5a957b9136d54279f09b37c80dbb36da8e0a1a680b74" gracePeriod=10 Feb 21 22:06:30 crc kubenswrapper[4717]: I0221 22:06:30.979437 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-q56p6"] Feb 21 22:06:30 crc kubenswrapper[4717]: I0221 22:06:30.980967 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.012743 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-q56p6"] Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.117675 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.117743 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc 
kubenswrapper[4717]: I0221 22:06:31.117770 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.117827 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-config\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.117845 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.118028 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.118179 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq68t\" (UniqueName: \"kubernetes.io/projected/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-kube-api-access-bq68t\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " 
pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.219746 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq68t\" (UniqueName: \"kubernetes.io/projected/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-kube-api-access-bq68t\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.220104 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.220157 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.220180 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.220222 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-config\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 
crc kubenswrapper[4717]: I0221 22:06:31.220240 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.220286 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.221220 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.221282 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.222147 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.222392 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.222749 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-config\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.223091 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.258538 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq68t\" (UniqueName: \"kubernetes.io/projected/ba43f982-ee7f-4e48-a144-0c6d5c54c5a1-kube-api-access-bq68t\") pod \"dnsmasq-dns-cb6ffcf87-q56p6\" (UID: \"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1\") " pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.343957 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.366191 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.423423 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-config\") pod \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.423487 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-ovsdbserver-nb\") pod \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.423545 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcc5f\" (UniqueName: \"kubernetes.io/projected/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-kube-api-access-pcc5f\") pod \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.423655 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-dns-svc\") pod \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.423678 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-dns-swift-storage-0\") pod \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.423738 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-ovsdbserver-sb\") pod \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\" (UID: \"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8\") " Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.429129 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-kube-api-access-pcc5f" (OuterVolumeSpecName: "kube-api-access-pcc5f") pod "29abbd2c-eeb4-49e9-afd3-8947f3b50ba8" (UID: "29abbd2c-eeb4-49e9-afd3-8947f3b50ba8"). InnerVolumeSpecName "kube-api-access-pcc5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.481767 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "29abbd2c-eeb4-49e9-afd3-8947f3b50ba8" (UID: "29abbd2c-eeb4-49e9-afd3-8947f3b50ba8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.481957 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-config" (OuterVolumeSpecName: "config") pod "29abbd2c-eeb4-49e9-afd3-8947f3b50ba8" (UID: "29abbd2c-eeb4-49e9-afd3-8947f3b50ba8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.507211 4717 generic.go:334] "Generic (PLEG): container finished" podID="29abbd2c-eeb4-49e9-afd3-8947f3b50ba8" containerID="c3759846fd9fb78a9f4a5a957b9136d54279f09b37c80dbb36da8e0a1a680b74" exitCode=0 Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.507500 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" event={"ID":"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8","Type":"ContainerDied","Data":"c3759846fd9fb78a9f4a5a957b9136d54279f09b37c80dbb36da8e0a1a680b74"} Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.507524 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" event={"ID":"29abbd2c-eeb4-49e9-afd3-8947f3b50ba8","Type":"ContainerDied","Data":"7ebfd5a7e70aa2b41da406478a2f88f1478b8bc92937e5eecb8214a54b263dbd"} Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.507539 4717 scope.go:117] "RemoveContainer" containerID="c3759846fd9fb78a9f4a5a957b9136d54279f09b37c80dbb36da8e0a1a680b74" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.507692 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-82jvl" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.516418 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "29abbd2c-eeb4-49e9-afd3-8947f3b50ba8" (UID: "29abbd2c-eeb4-49e9-afd3-8947f3b50ba8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.527572 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcc5f\" (UniqueName: \"kubernetes.io/projected/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-kube-api-access-pcc5f\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.527609 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.527622 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.527631 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.528398 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "29abbd2c-eeb4-49e9-afd3-8947f3b50ba8" (UID: "29abbd2c-eeb4-49e9-afd3-8947f3b50ba8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.529426 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "29abbd2c-eeb4-49e9-afd3-8947f3b50ba8" (UID: "29abbd2c-eeb4-49e9-afd3-8947f3b50ba8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.530915 4717 scope.go:117] "RemoveContainer" containerID="88cf53c5b5a67caaf62ff548cc2b3d4e3a8774262ac9a58237f434008fa28f60" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.556967 4717 scope.go:117] "RemoveContainer" containerID="c3759846fd9fb78a9f4a5a957b9136d54279f09b37c80dbb36da8e0a1a680b74" Feb 21 22:06:31 crc kubenswrapper[4717]: E0221 22:06:31.561553 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3759846fd9fb78a9f4a5a957b9136d54279f09b37c80dbb36da8e0a1a680b74\": container with ID starting with c3759846fd9fb78a9f4a5a957b9136d54279f09b37c80dbb36da8e0a1a680b74 not found: ID does not exist" containerID="c3759846fd9fb78a9f4a5a957b9136d54279f09b37c80dbb36da8e0a1a680b74" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.561591 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3759846fd9fb78a9f4a5a957b9136d54279f09b37c80dbb36da8e0a1a680b74"} err="failed to get container status \"c3759846fd9fb78a9f4a5a957b9136d54279f09b37c80dbb36da8e0a1a680b74\": rpc error: code = NotFound desc = could not find container \"c3759846fd9fb78a9f4a5a957b9136d54279f09b37c80dbb36da8e0a1a680b74\": container with ID starting with c3759846fd9fb78a9f4a5a957b9136d54279f09b37c80dbb36da8e0a1a680b74 not found: ID does not exist" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.561618 4717 scope.go:117] "RemoveContainer" containerID="88cf53c5b5a67caaf62ff548cc2b3d4e3a8774262ac9a58237f434008fa28f60" Feb 21 22:06:31 crc kubenswrapper[4717]: E0221 22:06:31.562067 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88cf53c5b5a67caaf62ff548cc2b3d4e3a8774262ac9a58237f434008fa28f60\": container with ID starting with 
88cf53c5b5a67caaf62ff548cc2b3d4e3a8774262ac9a58237f434008fa28f60 not found: ID does not exist" containerID="88cf53c5b5a67caaf62ff548cc2b3d4e3a8774262ac9a58237f434008fa28f60" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.562086 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88cf53c5b5a67caaf62ff548cc2b3d4e3a8774262ac9a58237f434008fa28f60"} err="failed to get container status \"88cf53c5b5a67caaf62ff548cc2b3d4e3a8774262ac9a58237f434008fa28f60\": rpc error: code = NotFound desc = could not find container \"88cf53c5b5a67caaf62ff548cc2b3d4e3a8774262ac9a58237f434008fa28f60\": container with ID starting with 88cf53c5b5a67caaf62ff548cc2b3d4e3a8774262ac9a58237f434008fa28f60 not found: ID does not exist" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.633129 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.633167 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.843143 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-82jvl"] Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.852342 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-82jvl"] Feb 21 22:06:31 crc kubenswrapper[4717]: I0221 22:06:31.876610 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-q56p6"] Feb 21 22:06:32 crc kubenswrapper[4717]: I0221 22:06:32.001218 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29abbd2c-eeb4-49e9-afd3-8947f3b50ba8" 
path="/var/lib/kubelet/pods/29abbd2c-eeb4-49e9-afd3-8947f3b50ba8/volumes" Feb 21 22:06:32 crc kubenswrapper[4717]: I0221 22:06:32.528625 4717 generic.go:334] "Generic (PLEG): container finished" podID="ba43f982-ee7f-4e48-a144-0c6d5c54c5a1" containerID="9653196138d771fec6e173590882585edc41574f12d73c9dd548153032a27b72" exitCode=0 Feb 21 22:06:32 crc kubenswrapper[4717]: I0221 22:06:32.528698 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" event={"ID":"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1","Type":"ContainerDied","Data":"9653196138d771fec6e173590882585edc41574f12d73c9dd548153032a27b72"} Feb 21 22:06:32 crc kubenswrapper[4717]: I0221 22:06:32.528740 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" event={"ID":"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1","Type":"ContainerStarted","Data":"5012acb18e05f0138269071f63d56ad6e2d467025aa19304a6ca1f299432e8d8"} Feb 21 22:06:33 crc kubenswrapper[4717]: I0221 22:06:33.537938 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" event={"ID":"ba43f982-ee7f-4e48-a144-0c6d5c54c5a1","Type":"ContainerStarted","Data":"07ddbe47599104df56ec79484a78e3a9c190c969407fa7ad3bb702494819c5a5"} Feb 21 22:06:33 crc kubenswrapper[4717]: I0221 22:06:33.538369 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:33 crc kubenswrapper[4717]: I0221 22:06:33.568392 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" podStartSLOduration=3.568378725 podStartE2EDuration="3.568378725s" podCreationTimestamp="2026-02-21 22:06:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:06:33.566106222 +0000 UTC m=+1208.347639844" watchObservedRunningTime="2026-02-21 22:06:33.568378725 +0000 UTC 
m=+1208.349912347" Feb 21 22:06:41 crc kubenswrapper[4717]: I0221 22:06:41.368172 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb6ffcf87-q56p6" Feb 21 22:06:41 crc kubenswrapper[4717]: I0221 22:06:41.473571 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-blkrj"] Feb 21 22:06:41 crc kubenswrapper[4717]: I0221 22:06:41.473830 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b789f86c-blkrj" podUID="f002385f-3e66-4ced-a9e6-b2f056fb9053" containerName="dnsmasq-dns" containerID="cri-o://929e69a9b7bd65b9c9ff24c0b2e02d425f2f28889df151451fb1810b84f76b75" gracePeriod=10 Feb 21 22:06:41 crc kubenswrapper[4717]: I0221 22:06:41.623718 4717 generic.go:334] "Generic (PLEG): container finished" podID="f002385f-3e66-4ced-a9e6-b2f056fb9053" containerID="929e69a9b7bd65b9c9ff24c0b2e02d425f2f28889df151451fb1810b84f76b75" exitCode=0 Feb 21 22:06:41 crc kubenswrapper[4717]: I0221 22:06:41.623761 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-blkrj" event={"ID":"f002385f-3e66-4ced-a9e6-b2f056fb9053","Type":"ContainerDied","Data":"929e69a9b7bd65b9c9ff24c0b2e02d425f2f28889df151451fb1810b84f76b75"} Feb 21 22:06:41 crc kubenswrapper[4717]: I0221 22:06:41.932681 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.067302 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-ovsdbserver-nb\") pod \"f002385f-3e66-4ced-a9e6-b2f056fb9053\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.067377 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-config\") pod \"f002385f-3e66-4ced-a9e6-b2f056fb9053\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.067402 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-ovsdbserver-sb\") pod \"f002385f-3e66-4ced-a9e6-b2f056fb9053\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.067426 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-dns-svc\") pod \"f002385f-3e66-4ced-a9e6-b2f056fb9053\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.067475 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-dns-swift-storage-0\") pod \"f002385f-3e66-4ced-a9e6-b2f056fb9053\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.067551 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfs85\" 
(UniqueName: \"kubernetes.io/projected/f002385f-3e66-4ced-a9e6-b2f056fb9053-kube-api-access-tfs85\") pod \"f002385f-3e66-4ced-a9e6-b2f056fb9053\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.067603 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-openstack-edpm-ipam\") pod \"f002385f-3e66-4ced-a9e6-b2f056fb9053\" (UID: \"f002385f-3e66-4ced-a9e6-b2f056fb9053\") " Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.075142 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f002385f-3e66-4ced-a9e6-b2f056fb9053-kube-api-access-tfs85" (OuterVolumeSpecName: "kube-api-access-tfs85") pod "f002385f-3e66-4ced-a9e6-b2f056fb9053" (UID: "f002385f-3e66-4ced-a9e6-b2f056fb9053"). InnerVolumeSpecName "kube-api-access-tfs85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.120339 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-config" (OuterVolumeSpecName: "config") pod "f002385f-3e66-4ced-a9e6-b2f056fb9053" (UID: "f002385f-3e66-4ced-a9e6-b2f056fb9053"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.123180 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f002385f-3e66-4ced-a9e6-b2f056fb9053" (UID: "f002385f-3e66-4ced-a9e6-b2f056fb9053"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.131495 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "f002385f-3e66-4ced-a9e6-b2f056fb9053" (UID: "f002385f-3e66-4ced-a9e6-b2f056fb9053"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.135062 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f002385f-3e66-4ced-a9e6-b2f056fb9053" (UID: "f002385f-3e66-4ced-a9e6-b2f056fb9053"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.139035 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f002385f-3e66-4ced-a9e6-b2f056fb9053" (UID: "f002385f-3e66-4ced-a9e6-b2f056fb9053"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.143324 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f002385f-3e66-4ced-a9e6-b2f056fb9053" (UID: "f002385f-3e66-4ced-a9e6-b2f056fb9053"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.169848 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.169907 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.169920 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.169931 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.169942 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.169954 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfs85\" (UniqueName: \"kubernetes.io/projected/f002385f-3e66-4ced-a9e6-b2f056fb9053-kube-api-access-tfs85\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.169968 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f002385f-3e66-4ced-a9e6-b2f056fb9053-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.632037 
4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-blkrj" event={"ID":"f002385f-3e66-4ced-a9e6-b2f056fb9053","Type":"ContainerDied","Data":"3d8b19125c3222d51914d1ec925ef41a1d8fcc354552c6cdcd4c0f497811a872"} Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.632101 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-blkrj" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.632372 4717 scope.go:117] "RemoveContainer" containerID="929e69a9b7bd65b9c9ff24c0b2e02d425f2f28889df151451fb1810b84f76b75" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.654083 4717 scope.go:117] "RemoveContainer" containerID="2edc4acfdf5ac4bab9bf5b5314800a199a484c6d86f2dc3e611e659b68174725" Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.666407 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-blkrj"] Feb 21 22:06:42 crc kubenswrapper[4717]: I0221 22:06:42.675067 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-blkrj"] Feb 21 22:06:43 crc kubenswrapper[4717]: I0221 22:06:43.993916 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f002385f-3e66-4ced-a9e6-b2f056fb9053" path="/var/lib/kubelet/pods/f002385f-3e66-4ced-a9e6-b2f056fb9053/volumes" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.357263 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg"] Feb 21 22:06:54 crc kubenswrapper[4717]: E0221 22:06:54.358226 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29abbd2c-eeb4-49e9-afd3-8947f3b50ba8" containerName="init" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.358240 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="29abbd2c-eeb4-49e9-afd3-8947f3b50ba8" containerName="init" Feb 21 22:06:54 crc kubenswrapper[4717]: E0221 22:06:54.358257 
4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29abbd2c-eeb4-49e9-afd3-8947f3b50ba8" containerName="dnsmasq-dns" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.358266 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="29abbd2c-eeb4-49e9-afd3-8947f3b50ba8" containerName="dnsmasq-dns" Feb 21 22:06:54 crc kubenswrapper[4717]: E0221 22:06:54.358276 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f002385f-3e66-4ced-a9e6-b2f056fb9053" containerName="init" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.358284 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f002385f-3e66-4ced-a9e6-b2f056fb9053" containerName="init" Feb 21 22:06:54 crc kubenswrapper[4717]: E0221 22:06:54.358323 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f002385f-3e66-4ced-a9e6-b2f056fb9053" containerName="dnsmasq-dns" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.358331 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f002385f-3e66-4ced-a9e6-b2f056fb9053" containerName="dnsmasq-dns" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.358548 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f002385f-3e66-4ced-a9e6-b2f056fb9053" containerName="dnsmasq-dns" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.358566 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="29abbd2c-eeb4-49e9-afd3-8947f3b50ba8" containerName="dnsmasq-dns" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.359255 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.363621 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.363688 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.363627 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.363929 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.369672 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg"] Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.465992 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.466403 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.466534 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.466577 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x7kg\" (UniqueName: \"kubernetes.io/projected/331acbed-5028-4fdb-84a4-b105805863b9-kube-api-access-5x7kg\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.568295 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x7kg\" (UniqueName: \"kubernetes.io/projected/331acbed-5028-4fdb-84a4-b105805863b9-kube-api-access-5x7kg\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.568477 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.568582 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.568709 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.574472 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.575923 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.578951 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.600929 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x7kg\" (UniqueName: \"kubernetes.io/projected/331acbed-5028-4fdb-84a4-b105805863b9-kube-api-access-5x7kg\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:06:54 crc kubenswrapper[4717]: I0221 22:06:54.681061 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:06:55 crc kubenswrapper[4717]: I0221 22:06:55.281012 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg"] Feb 21 22:06:55 crc kubenswrapper[4717]: W0221 22:06:55.290542 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod331acbed_5028_4fdb_84a4_b105805863b9.slice/crio-9ba104318facd4c158ff07e57486007694a1e2b041980fa5961123d967b0162a WatchSource:0}: Error finding container 9ba104318facd4c158ff07e57486007694a1e2b041980fa5961123d967b0162a: Status 404 returned error can't find the container with id 9ba104318facd4c158ff07e57486007694a1e2b041980fa5961123d967b0162a Feb 21 22:06:55 crc kubenswrapper[4717]: I0221 22:06:55.294604 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 21 22:06:55 crc kubenswrapper[4717]: I0221 22:06:55.768131 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" event={"ID":"331acbed-5028-4fdb-84a4-b105805863b9","Type":"ContainerStarted","Data":"9ba104318facd4c158ff07e57486007694a1e2b041980fa5961123d967b0162a"} Feb 21 22:06:57 crc kubenswrapper[4717]: I0221 
22:06:57.110263 4717 generic.go:334] "Generic (PLEG): container finished" podID="38ce8ac2-d776-449b-89d8-3e9a853a8f44" containerID="5b90a0fe8530aed03f95e8431d2b07f18f16c7b9f843de3db90705cbc0933337" exitCode=0 Feb 21 22:06:57 crc kubenswrapper[4717]: I0221 22:06:57.110326 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38ce8ac2-d776-449b-89d8-3e9a853a8f44","Type":"ContainerDied","Data":"5b90a0fe8530aed03f95e8431d2b07f18f16c7b9f843de3db90705cbc0933337"} Feb 21 22:06:58 crc kubenswrapper[4717]: I0221 22:06:58.119949 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38ce8ac2-d776-449b-89d8-3e9a853a8f44","Type":"ContainerStarted","Data":"f411c9f929fe995100d1073f401c21c90bba1a47c6b91245a3254fa6a2afb69d"} Feb 21 22:06:58 crc kubenswrapper[4717]: I0221 22:06:58.120612 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 21 22:06:58 crc kubenswrapper[4717]: I0221 22:06:58.139791 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.139776387 podStartE2EDuration="38.139776387s" podCreationTimestamp="2026-02-21 22:06:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:06:58.137954613 +0000 UTC m=+1232.919488265" watchObservedRunningTime="2026-02-21 22:06:58.139776387 +0000 UTC m=+1232.921309999" Feb 21 22:06:59 crc kubenswrapper[4717]: I0221 22:06:59.133002 4717 generic.go:334] "Generic (PLEG): container finished" podID="b5547796-04e1-40e3-aa4a-a1aa936efcda" containerID="ad342b7fa122b1db94827e2b955527035a405f8d7cfe682a72f00555fdd181cc" exitCode=0 Feb 21 22:06:59 crc kubenswrapper[4717]: I0221 22:06:59.133037 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"b5547796-04e1-40e3-aa4a-a1aa936efcda","Type":"ContainerDied","Data":"ad342b7fa122b1db94827e2b955527035a405f8d7cfe682a72f00555fdd181cc"} Feb 21 22:07:05 crc kubenswrapper[4717]: I0221 22:07:05.203733 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" event={"ID":"331acbed-5028-4fdb-84a4-b105805863b9","Type":"ContainerStarted","Data":"f050a13d48f244ecb7c609aeb2781dae337dcaf567df37592925b733095580c9"} Feb 21 22:07:05 crc kubenswrapper[4717]: I0221 22:07:05.206199 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b5547796-04e1-40e3-aa4a-a1aa936efcda","Type":"ContainerStarted","Data":"ca1360723f9abc83bed0118ade3d1d8765a0d2cda79b2401bdfaf04e025c634c"} Feb 21 22:07:05 crc kubenswrapper[4717]: I0221 22:07:05.206515 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:07:05 crc kubenswrapper[4717]: I0221 22:07:05.235268 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" podStartSLOduration=2.12766194 podStartE2EDuration="11.235242397s" podCreationTimestamp="2026-02-21 22:06:54 +0000 UTC" firstStartedPulling="2026-02-21 22:06:55.293992459 +0000 UTC m=+1230.075526121" lastFinishedPulling="2026-02-21 22:07:04.401572956 +0000 UTC m=+1239.183106578" observedRunningTime="2026-02-21 22:07:05.227920282 +0000 UTC m=+1240.009453944" watchObservedRunningTime="2026-02-21 22:07:05.235242397 +0000 UTC m=+1240.016776059" Feb 21 22:07:05 crc kubenswrapper[4717]: I0221 22:07:05.260790 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=44.260760048 podStartE2EDuration="44.260760048s" podCreationTimestamp="2026-02-21 22:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:07:05.260366818 +0000 UTC m=+1240.041900470" watchObservedRunningTime="2026-02-21 22:07:05.260760048 +0000 UTC m=+1240.042293710" Feb 21 22:07:09 crc kubenswrapper[4717]: I0221 22:07:09.062662 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:07:09 crc kubenswrapper[4717]: I0221 22:07:09.064123 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:07:11 crc kubenswrapper[4717]: I0221 22:07:11.163036 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 21 22:07:16 crc kubenswrapper[4717]: I0221 22:07:16.323203 4717 generic.go:334] "Generic (PLEG): container finished" podID="331acbed-5028-4fdb-84a4-b105805863b9" containerID="f050a13d48f244ecb7c609aeb2781dae337dcaf567df37592925b733095580c9" exitCode=0 Feb 21 22:07:16 crc kubenswrapper[4717]: I0221 22:07:16.323371 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" event={"ID":"331acbed-5028-4fdb-84a4-b105805863b9","Type":"ContainerDied","Data":"f050a13d48f244ecb7c609aeb2781dae337dcaf567df37592925b733095580c9"} Feb 21 22:07:17 crc kubenswrapper[4717]: I0221 22:07:17.819809 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.007174 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-ssh-key-openstack-edpm-ipam\") pod \"331acbed-5028-4fdb-84a4-b105805863b9\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.007494 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-repo-setup-combined-ca-bundle\") pod \"331acbed-5028-4fdb-84a4-b105805863b9\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.007749 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-inventory\") pod \"331acbed-5028-4fdb-84a4-b105805863b9\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.008029 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x7kg\" (UniqueName: \"kubernetes.io/projected/331acbed-5028-4fdb-84a4-b105805863b9-kube-api-access-5x7kg\") pod \"331acbed-5028-4fdb-84a4-b105805863b9\" (UID: \"331acbed-5028-4fdb-84a4-b105805863b9\") " Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.017103 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "331acbed-5028-4fdb-84a4-b105805863b9" (UID: "331acbed-5028-4fdb-84a4-b105805863b9"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.017125 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331acbed-5028-4fdb-84a4-b105805863b9-kube-api-access-5x7kg" (OuterVolumeSpecName: "kube-api-access-5x7kg") pod "331acbed-5028-4fdb-84a4-b105805863b9" (UID: "331acbed-5028-4fdb-84a4-b105805863b9"). InnerVolumeSpecName "kube-api-access-5x7kg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.057734 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-inventory" (OuterVolumeSpecName: "inventory") pod "331acbed-5028-4fdb-84a4-b105805863b9" (UID: "331acbed-5028-4fdb-84a4-b105805863b9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.061085 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "331acbed-5028-4fdb-84a4-b105805863b9" (UID: "331acbed-5028-4fdb-84a4-b105805863b9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.111939 4717 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.111997 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-inventory\") on node \"crc\" DevicePath \"\"" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.112030 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x7kg\" (UniqueName: \"kubernetes.io/projected/331acbed-5028-4fdb-84a4-b105805863b9-kube-api-access-5x7kg\") on node \"crc\" DevicePath \"\"" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.112057 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/331acbed-5028-4fdb-84a4-b105805863b9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.354193 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" event={"ID":"331acbed-5028-4fdb-84a4-b105805863b9","Type":"ContainerDied","Data":"9ba104318facd4c158ff07e57486007694a1e2b041980fa5961123d967b0162a"} Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.354559 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ba104318facd4c158ff07e57486007694a1e2b041980fa5961123d967b0162a" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.354732 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.440721 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt"] Feb 21 22:07:18 crc kubenswrapper[4717]: E0221 22:07:18.441251 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331acbed-5028-4fdb-84a4-b105805863b9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.441275 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="331acbed-5028-4fdb-84a4-b105805863b9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.441519 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="331acbed-5028-4fdb-84a4-b105805863b9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.442332 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.445817 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.446113 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.447557 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.449087 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.469736 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt"] Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.530156 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tqqzt\" (UID: \"49b31634-aebd-4b3a-a1a3-f7d3e06782cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.530719 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz6jj\" (UniqueName: \"kubernetes.io/projected/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-kube-api-access-vz6jj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tqqzt\" (UID: \"49b31634-aebd-4b3a-a1a3-f7d3e06782cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.530952 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tqqzt\" (UID: \"49b31634-aebd-4b3a-a1a3-f7d3e06782cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.633276 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tqqzt\" (UID: \"49b31634-aebd-4b3a-a1a3-f7d3e06782cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.633342 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz6jj\" (UniqueName: \"kubernetes.io/projected/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-kube-api-access-vz6jj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tqqzt\" (UID: \"49b31634-aebd-4b3a-a1a3-f7d3e06782cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.633372 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tqqzt\" (UID: \"49b31634-aebd-4b3a-a1a3-f7d3e06782cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.645759 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-tqqzt\" (UID: \"49b31634-aebd-4b3a-a1a3-f7d3e06782cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.646249 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tqqzt\" (UID: \"49b31634-aebd-4b3a-a1a3-f7d3e06782cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.650142 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz6jj\" (UniqueName: \"kubernetes.io/projected/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-kube-api-access-vz6jj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tqqzt\" (UID: \"49b31634-aebd-4b3a-a1a3-f7d3e06782cf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" Feb 21 22:07:18 crc kubenswrapper[4717]: I0221 22:07:18.765293 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" Feb 21 22:07:19 crc kubenswrapper[4717]: I0221 22:07:19.124176 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt"] Feb 21 22:07:19 crc kubenswrapper[4717]: I0221 22:07:19.372602 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" event={"ID":"49b31634-aebd-4b3a-a1a3-f7d3e06782cf","Type":"ContainerStarted","Data":"5dd0a3aacd29133d96aa420413f52f98fd5b36eae7abd90c8e5da6542711f83d"} Feb 21 22:07:20 crc kubenswrapper[4717]: I0221 22:07:20.389519 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" event={"ID":"49b31634-aebd-4b3a-a1a3-f7d3e06782cf","Type":"ContainerStarted","Data":"c039e953ab2b216f16a182f262f090a0cb4b024b687d2f390f70155fed72533b"} Feb 21 22:07:20 crc kubenswrapper[4717]: I0221 22:07:20.429525 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" podStartSLOduration=2.036155602 podStartE2EDuration="2.429495736s" podCreationTimestamp="2026-02-21 22:07:18 +0000 UTC" firstStartedPulling="2026-02-21 22:07:19.127381685 +0000 UTC m=+1253.908915307" lastFinishedPulling="2026-02-21 22:07:19.520721789 +0000 UTC m=+1254.302255441" observedRunningTime="2026-02-21 22:07:20.413459953 +0000 UTC m=+1255.194993615" watchObservedRunningTime="2026-02-21 22:07:20.429495736 +0000 UTC m=+1255.211029398" Feb 21 22:07:22 crc kubenswrapper[4717]: I0221 22:07:22.151107 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 21 22:07:22 crc kubenswrapper[4717]: I0221 22:07:22.410113 4717 generic.go:334] "Generic (PLEG): container finished" podID="49b31634-aebd-4b3a-a1a3-f7d3e06782cf" 
containerID="c039e953ab2b216f16a182f262f090a0cb4b024b687d2f390f70155fed72533b" exitCode=0 Feb 21 22:07:22 crc kubenswrapper[4717]: I0221 22:07:22.410160 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" event={"ID":"49b31634-aebd-4b3a-a1a3-f7d3e06782cf","Type":"ContainerDied","Data":"c039e953ab2b216f16a182f262f090a0cb4b024b687d2f390f70155fed72533b"} Feb 21 22:07:23 crc kubenswrapper[4717]: I0221 22:07:23.935139 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" Feb 21 22:07:23 crc kubenswrapper[4717]: I0221 22:07:23.960465 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz6jj\" (UniqueName: \"kubernetes.io/projected/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-kube-api-access-vz6jj\") pod \"49b31634-aebd-4b3a-a1a3-f7d3e06782cf\" (UID: \"49b31634-aebd-4b3a-a1a3-f7d3e06782cf\") " Feb 21 22:07:23 crc kubenswrapper[4717]: I0221 22:07:23.960558 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-ssh-key-openstack-edpm-ipam\") pod \"49b31634-aebd-4b3a-a1a3-f7d3e06782cf\" (UID: \"49b31634-aebd-4b3a-a1a3-f7d3e06782cf\") " Feb 21 22:07:23 crc kubenswrapper[4717]: I0221 22:07:23.960650 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-inventory\") pod \"49b31634-aebd-4b3a-a1a3-f7d3e06782cf\" (UID: \"49b31634-aebd-4b3a-a1a3-f7d3e06782cf\") " Feb 21 22:07:23 crc kubenswrapper[4717]: I0221 22:07:23.966664 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-kube-api-access-vz6jj" (OuterVolumeSpecName: "kube-api-access-vz6jj") pod 
"49b31634-aebd-4b3a-a1a3-f7d3e06782cf" (UID: "49b31634-aebd-4b3a-a1a3-f7d3e06782cf"). InnerVolumeSpecName "kube-api-access-vz6jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:07:23 crc kubenswrapper[4717]: I0221 22:07:23.988852 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "49b31634-aebd-4b3a-a1a3-f7d3e06782cf" (UID: "49b31634-aebd-4b3a-a1a3-f7d3e06782cf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.027103 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-inventory" (OuterVolumeSpecName: "inventory") pod "49b31634-aebd-4b3a-a1a3-f7d3e06782cf" (UID: "49b31634-aebd-4b3a-a1a3-f7d3e06782cf"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.063059 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.063101 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-inventory\") on node \"crc\" DevicePath \"\"" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.063114 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz6jj\" (UniqueName: \"kubernetes.io/projected/49b31634-aebd-4b3a-a1a3-f7d3e06782cf-kube-api-access-vz6jj\") on node \"crc\" DevicePath \"\"" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.432221 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" event={"ID":"49b31634-aebd-4b3a-a1a3-f7d3e06782cf","Type":"ContainerDied","Data":"5dd0a3aacd29133d96aa420413f52f98fd5b36eae7abd90c8e5da6542711f83d"} Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.432267 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dd0a3aacd29133d96aa420413f52f98fd5b36eae7abd90c8e5da6542711f83d" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.432517 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tqqzt" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.503318 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55"] Feb 21 22:07:24 crc kubenswrapper[4717]: E0221 22:07:24.503683 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b31634-aebd-4b3a-a1a3-f7d3e06782cf" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.503700 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b31634-aebd-4b3a-a1a3-f7d3e06782cf" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.503899 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b31634-aebd-4b3a-a1a3-f7d3e06782cf" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.504465 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.510310 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.510447 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.510511 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.510553 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.523354 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55"] Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.571580 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-twq55\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.571639 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-twq55\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.571666 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjkml\" (UniqueName: \"kubernetes.io/projected/da6e1269-a5c6-4f39-8d0a-b544de9522ba-kube-api-access-kjkml\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-twq55\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.571721 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-twq55\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.673788 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-twq55\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.674177 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjkml\" (UniqueName: \"kubernetes.io/projected/da6e1269-a5c6-4f39-8d0a-b544de9522ba-kube-api-access-kjkml\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-twq55\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.674458 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-twq55\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.674897 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-twq55\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.679066 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-twq55\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.679977 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-twq55\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.680426 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-twq55\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.694459 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjkml\" (UniqueName: \"kubernetes.io/projected/da6e1269-a5c6-4f39-8d0a-b544de9522ba-kube-api-access-kjkml\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-twq55\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:07:24 crc kubenswrapper[4717]: I0221 22:07:24.822162 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:07:25 crc kubenswrapper[4717]: I0221 22:07:25.401638 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55"] Feb 21 22:07:25 crc kubenswrapper[4717]: W0221 22:07:25.404243 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda6e1269_a5c6_4f39_8d0a_b544de9522ba.slice/crio-1e6ba363e7522286cde8bccfe89107870e71e92e90059fb25e647e713b9c8db9 WatchSource:0}: Error finding container 1e6ba363e7522286cde8bccfe89107870e71e92e90059fb25e647e713b9c8db9: Status 404 returned error can't find the container with id 1e6ba363e7522286cde8bccfe89107870e71e92e90059fb25e647e713b9c8db9 Feb 21 22:07:25 crc kubenswrapper[4717]: I0221 22:07:25.452129 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" event={"ID":"da6e1269-a5c6-4f39-8d0a-b544de9522ba","Type":"ContainerStarted","Data":"1e6ba363e7522286cde8bccfe89107870e71e92e90059fb25e647e713b9c8db9"} Feb 21 22:07:26 crc kubenswrapper[4717]: I0221 22:07:26.467161 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" 
event={"ID":"da6e1269-a5c6-4f39-8d0a-b544de9522ba","Type":"ContainerStarted","Data":"b59dc34051009c787f90ab179eba45cb21d0e0d94acba9366fe927fe2dea50b8"} Feb 21 22:07:26 crc kubenswrapper[4717]: I0221 22:07:26.505218 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" podStartSLOduration=2.077820209 podStartE2EDuration="2.505184226s" podCreationTimestamp="2026-02-21 22:07:24 +0000 UTC" firstStartedPulling="2026-02-21 22:07:25.409059669 +0000 UTC m=+1260.190593341" lastFinishedPulling="2026-02-21 22:07:25.836423696 +0000 UTC m=+1260.617957358" observedRunningTime="2026-02-21 22:07:26.489156752 +0000 UTC m=+1261.270690414" watchObservedRunningTime="2026-02-21 22:07:26.505184226 +0000 UTC m=+1261.286717888" Feb 21 22:07:39 crc kubenswrapper[4717]: I0221 22:07:39.063159 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:07:39 crc kubenswrapper[4717]: I0221 22:07:39.063956 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:08:09 crc kubenswrapper[4717]: I0221 22:08:09.062605 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:08:09 crc kubenswrapper[4717]: I0221 22:08:09.063128 4717 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:08:09 crc kubenswrapper[4717]: I0221 22:08:09.063176 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 22:08:09 crc kubenswrapper[4717]: I0221 22:08:09.063982 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1519f9b66326a62378129e107d074b9de1f98d8964cf8073c190668b31d38eb"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 22:08:09 crc kubenswrapper[4717]: I0221 22:08:09.064053 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://d1519f9b66326a62378129e107d074b9de1f98d8964cf8073c190668b31d38eb" gracePeriod=600 Feb 21 22:08:10 crc kubenswrapper[4717]: I0221 22:08:10.006441 4717 generic.go:334] "Generic (PLEG): container finished" podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerID="d1519f9b66326a62378129e107d074b9de1f98d8964cf8073c190668b31d38eb" exitCode=0 Feb 21 22:08:10 crc kubenswrapper[4717]: I0221 22:08:10.011907 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"d1519f9b66326a62378129e107d074b9de1f98d8964cf8073c190668b31d38eb"} Feb 21 22:08:10 crc kubenswrapper[4717]: I0221 22:08:10.011956 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d"} Feb 21 22:08:10 crc kubenswrapper[4717]: I0221 22:08:10.011975 4717 scope.go:117] "RemoveContainer" containerID="d6ea5ddcf698b572f76b6bdcff7985d26f0ef62fef8084d68925d625b747dd34" Feb 21 22:08:27 crc kubenswrapper[4717]: I0221 22:08:27.643338 4717 scope.go:117] "RemoveContainer" containerID="78ed7a4e15a63c17670c1a65ac66fd9b7cf834115de5ac9b2ebdf64a6319cfb5" Feb 21 22:08:27 crc kubenswrapper[4717]: I0221 22:08:27.685707 4717 scope.go:117] "RemoveContainer" containerID="73faedea9a78734027ca32d988a5bd77149a3417bc03ba797c60133876399194" Feb 21 22:08:27 crc kubenswrapper[4717]: I0221 22:08:27.757168 4717 scope.go:117] "RemoveContainer" containerID="79ee48e189a50ffdf5b61c9e64ea24fcde170e6bce71673b585b23d8eb26926a" Feb 21 22:09:27 crc kubenswrapper[4717]: I0221 22:09:27.911423 4717 scope.go:117] "RemoveContainer" containerID="fc5d9bc25940a3a0e67e4bd13fbf937a13b921f9b8084b93ebc9d58c63b41247" Feb 21 22:10:09 crc kubenswrapper[4717]: I0221 22:10:09.062501 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:10:09 crc kubenswrapper[4717]: I0221 22:10:09.063397 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:10:32 crc kubenswrapper[4717]: I0221 22:10:32.728847 4717 generic.go:334] 
"Generic (PLEG): container finished" podID="da6e1269-a5c6-4f39-8d0a-b544de9522ba" containerID="b59dc34051009c787f90ab179eba45cb21d0e0d94acba9366fe927fe2dea50b8" exitCode=0 Feb 21 22:10:32 crc kubenswrapper[4717]: I0221 22:10:32.728920 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" event={"ID":"da6e1269-a5c6-4f39-8d0a-b544de9522ba","Type":"ContainerDied","Data":"b59dc34051009c787f90ab179eba45cb21d0e0d94acba9366fe927fe2dea50b8"} Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.230380 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.303215 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-inventory\") pod \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.303308 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjkml\" (UniqueName: \"kubernetes.io/projected/da6e1269-a5c6-4f39-8d0a-b544de9522ba-kube-api-access-kjkml\") pod \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.303348 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-bootstrap-combined-ca-bundle\") pod \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.303450 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-ssh-key-openstack-edpm-ipam\") pod \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\" (UID: \"da6e1269-a5c6-4f39-8d0a-b544de9522ba\") " Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.309958 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "da6e1269-a5c6-4f39-8d0a-b544de9522ba" (UID: "da6e1269-a5c6-4f39-8d0a-b544de9522ba"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.322576 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da6e1269-a5c6-4f39-8d0a-b544de9522ba-kube-api-access-kjkml" (OuterVolumeSpecName: "kube-api-access-kjkml") pod "da6e1269-a5c6-4f39-8d0a-b544de9522ba" (UID: "da6e1269-a5c6-4f39-8d0a-b544de9522ba"). InnerVolumeSpecName "kube-api-access-kjkml". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.341965 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "da6e1269-a5c6-4f39-8d0a-b544de9522ba" (UID: "da6e1269-a5c6-4f39-8d0a-b544de9522ba"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.344378 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-inventory" (OuterVolumeSpecName: "inventory") pod "da6e1269-a5c6-4f39-8d0a-b544de9522ba" (UID: "da6e1269-a5c6-4f39-8d0a-b544de9522ba"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.405393 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-inventory\") on node \"crc\" DevicePath \"\"" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.405421 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjkml\" (UniqueName: \"kubernetes.io/projected/da6e1269-a5c6-4f39-8d0a-b544de9522ba-kube-api-access-kjkml\") on node \"crc\" DevicePath \"\"" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.405433 4717 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.405443 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da6e1269-a5c6-4f39-8d0a-b544de9522ba-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.751959 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" event={"ID":"da6e1269-a5c6-4f39-8d0a-b544de9522ba","Type":"ContainerDied","Data":"1e6ba363e7522286cde8bccfe89107870e71e92e90059fb25e647e713b9c8db9"} Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.751993 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e6ba363e7522286cde8bccfe89107870e71e92e90059fb25e647e713b9c8db9" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.752004 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-twq55" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.885839 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm"] Feb 21 22:10:34 crc kubenswrapper[4717]: E0221 22:10:34.886266 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da6e1269-a5c6-4f39-8d0a-b544de9522ba" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.886290 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="da6e1269-a5c6-4f39-8d0a-b544de9522ba" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.886543 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="da6e1269-a5c6-4f39-8d0a-b544de9522ba" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.887402 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.890336 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.890417 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.890601 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.890677 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s" Feb 21 22:10:34 crc kubenswrapper[4717]: I0221 22:10:34.911049 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm"] Feb 21 22:10:35 crc kubenswrapper[4717]: I0221 22:10:35.019010 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/674f8569-62c3-477e-85af-13befe292f49-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm\" (UID: \"674f8569-62c3-477e-85af-13befe292f49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" Feb 21 22:10:35 crc kubenswrapper[4717]: I0221 22:10:35.019467 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/674f8569-62c3-477e-85af-13befe292f49-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm\" (UID: \"674f8569-62c3-477e-85af-13befe292f49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" Feb 21 22:10:35 crc kubenswrapper[4717]: I0221 
22:10:35.019624 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v4gn\" (UniqueName: \"kubernetes.io/projected/674f8569-62c3-477e-85af-13befe292f49-kube-api-access-7v4gn\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm\" (UID: \"674f8569-62c3-477e-85af-13befe292f49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" Feb 21 22:10:35 crc kubenswrapper[4717]: I0221 22:10:35.122170 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/674f8569-62c3-477e-85af-13befe292f49-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm\" (UID: \"674f8569-62c3-477e-85af-13befe292f49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" Feb 21 22:10:35 crc kubenswrapper[4717]: I0221 22:10:35.122468 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v4gn\" (UniqueName: \"kubernetes.io/projected/674f8569-62c3-477e-85af-13befe292f49-kube-api-access-7v4gn\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm\" (UID: \"674f8569-62c3-477e-85af-13befe292f49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" Feb 21 22:10:35 crc kubenswrapper[4717]: I0221 22:10:35.122654 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/674f8569-62c3-477e-85af-13befe292f49-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm\" (UID: \"674f8569-62c3-477e-85af-13befe292f49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" Feb 21 22:10:35 crc kubenswrapper[4717]: I0221 22:10:35.127611 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/674f8569-62c3-477e-85af-13befe292f49-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm\" (UID: \"674f8569-62c3-477e-85af-13befe292f49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" Feb 21 22:10:35 crc kubenswrapper[4717]: I0221 22:10:35.128009 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/674f8569-62c3-477e-85af-13befe292f49-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm\" (UID: \"674f8569-62c3-477e-85af-13befe292f49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" Feb 21 22:10:35 crc kubenswrapper[4717]: I0221 22:10:35.142130 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v4gn\" (UniqueName: \"kubernetes.io/projected/674f8569-62c3-477e-85af-13befe292f49-kube-api-access-7v4gn\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm\" (UID: \"674f8569-62c3-477e-85af-13befe292f49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" Feb 21 22:10:35 crc kubenswrapper[4717]: I0221 22:10:35.209206 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" Feb 21 22:10:35 crc kubenswrapper[4717]: I0221 22:10:35.748574 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm"] Feb 21 22:10:35 crc kubenswrapper[4717]: W0221 22:10:35.765315 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod674f8569_62c3_477e_85af_13befe292f49.slice/crio-11fcee2f92110bf5912b1561f8a5c2367479c8c55326680bdf1f4c085cd9f4d7 WatchSource:0}: Error finding container 11fcee2f92110bf5912b1561f8a5c2367479c8c55326680bdf1f4c085cd9f4d7: Status 404 returned error can't find the container with id 11fcee2f92110bf5912b1561f8a5c2367479c8c55326680bdf1f4c085cd9f4d7 Feb 21 22:10:36 crc kubenswrapper[4717]: I0221 22:10:36.772061 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" event={"ID":"674f8569-62c3-477e-85af-13befe292f49","Type":"ContainerStarted","Data":"fe0b02a495767c6ac5b28a5bcabf06854d67da633f61f9b641a0a609dde2ab20"} Feb 21 22:10:36 crc kubenswrapper[4717]: I0221 22:10:36.772333 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" event={"ID":"674f8569-62c3-477e-85af-13befe292f49","Type":"ContainerStarted","Data":"11fcee2f92110bf5912b1561f8a5c2367479c8c55326680bdf1f4c085cd9f4d7"} Feb 21 22:10:36 crc kubenswrapper[4717]: I0221 22:10:36.802020 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" podStartSLOduration=2.376430525 podStartE2EDuration="2.801977752s" podCreationTimestamp="2026-02-21 22:10:34 +0000 UTC" firstStartedPulling="2026-02-21 22:10:35.769240424 +0000 UTC m=+1450.550774066" lastFinishedPulling="2026-02-21 22:10:36.194787631 +0000 UTC 
m=+1450.976321293" observedRunningTime="2026-02-21 22:10:36.790246806 +0000 UTC m=+1451.571780468" watchObservedRunningTime="2026-02-21 22:10:36.801977752 +0000 UTC m=+1451.583511384" Feb 21 22:10:39 crc kubenswrapper[4717]: I0221 22:10:39.062662 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:10:39 crc kubenswrapper[4717]: I0221 22:10:39.063175 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:11:09 crc kubenswrapper[4717]: I0221 22:11:09.062341 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:11:09 crc kubenswrapper[4717]: I0221 22:11:09.063313 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:11:09 crc kubenswrapper[4717]: I0221 22:11:09.063398 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 22:11:09 crc kubenswrapper[4717]: I0221 22:11:09.064711 4717 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 22:11:09 crc kubenswrapper[4717]: I0221 22:11:09.064840 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" gracePeriod=600 Feb 21 22:11:09 crc kubenswrapper[4717]: E0221 22:11:09.193191 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:11:10 crc kubenswrapper[4717]: I0221 22:11:10.139051 4717 generic.go:334] "Generic (PLEG): container finished" podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" exitCode=0 Feb 21 22:11:10 crc kubenswrapper[4717]: I0221 22:11:10.139097 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d"} Feb 21 22:11:10 crc kubenswrapper[4717]: I0221 22:11:10.139134 4717 scope.go:117] "RemoveContainer" containerID="d1519f9b66326a62378129e107d074b9de1f98d8964cf8073c190668b31d38eb" Feb 21 22:11:10 crc 
kubenswrapper[4717]: I0221 22:11:10.140032 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:11:10 crc kubenswrapper[4717]: E0221 22:11:10.140573 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:11:21 crc kubenswrapper[4717]: I0221 22:11:21.982016 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:11:21 crc kubenswrapper[4717]: E0221 22:11:21.983004 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:11:28 crc kubenswrapper[4717]: I0221 22:11:28.054480 4717 scope.go:117] "RemoveContainer" containerID="881fe70bd9ae540109e6c45cd6673ff8e6d21acea9c120bba184d76f3aac40a7" Feb 21 22:11:28 crc kubenswrapper[4717]: I0221 22:11:28.086772 4717 scope.go:117] "RemoveContainer" containerID="9b5a44b32186bf0ff77932e89566cf94762797a604a2c2cb8f1b7fb6f249a9f0" Feb 21 22:11:28 crc kubenswrapper[4717]: I0221 22:11:28.164975 4717 scope.go:117] "RemoveContainer" containerID="28ef9f785f1ee78a724ecc166a77b5fd7488d03cd2140a5e56a87961a98c89d5" Feb 21 22:11:32 crc kubenswrapper[4717]: I0221 22:11:32.977425 4717 scope.go:117] "RemoveContainer" 
containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:11:32 crc kubenswrapper[4717]: E0221 22:11:32.978375 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:11:46 crc kubenswrapper[4717]: I0221 22:11:46.977743 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:11:46 crc kubenswrapper[4717]: E0221 22:11:46.978817 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:11:51 crc kubenswrapper[4717]: I0221 22:11:51.051604 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-wcf2g"] Feb 21 22:11:51 crc kubenswrapper[4717]: I0221 22:11:51.069700 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0e51-account-create-update-xtqdq"] Feb 21 22:11:51 crc kubenswrapper[4717]: I0221 22:11:51.082065 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-wcf2g"] Feb 21 22:11:51 crc kubenswrapper[4717]: I0221 22:11:51.098119 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0e51-account-create-update-xtqdq"] Feb 21 22:11:51 crc kubenswrapper[4717]: I0221 22:11:51.994332 4717 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b97d305-a098-430c-9afd-36a981d3f978" path="/var/lib/kubelet/pods/4b97d305-a098-430c-9afd-36a981d3f978/volumes" Feb 21 22:11:51 crc kubenswrapper[4717]: I0221 22:11:51.995315 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7" path="/var/lib/kubelet/pods/517066d1-20f3-4bfd-a7dd-33cd6d3b5aa7/volumes" Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.041932 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rms66"] Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.055402 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rms66"] Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.097524 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7qfrz"] Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.099953 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.123479 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qfrz"] Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.269182 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3920142-ac52-4886-ae49-4fd7456b4cf4-utilities\") pod \"redhat-marketplace-7qfrz\" (UID: \"d3920142-ac52-4886-ae49-4fd7456b4cf4\") " pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.269313 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfvqm\" (UniqueName: \"kubernetes.io/projected/d3920142-ac52-4886-ae49-4fd7456b4cf4-kube-api-access-nfvqm\") pod \"redhat-marketplace-7qfrz\" (UID: \"d3920142-ac52-4886-ae49-4fd7456b4cf4\") " pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.269714 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3920142-ac52-4886-ae49-4fd7456b4cf4-catalog-content\") pod \"redhat-marketplace-7qfrz\" (UID: \"d3920142-ac52-4886-ae49-4fd7456b4cf4\") " pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.372242 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3920142-ac52-4886-ae49-4fd7456b4cf4-catalog-content\") pod \"redhat-marketplace-7qfrz\" (UID: \"d3920142-ac52-4886-ae49-4fd7456b4cf4\") " pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.372346 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3920142-ac52-4886-ae49-4fd7456b4cf4-utilities\") pod \"redhat-marketplace-7qfrz\" (UID: \"d3920142-ac52-4886-ae49-4fd7456b4cf4\") " pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.372515 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfvqm\" (UniqueName: \"kubernetes.io/projected/d3920142-ac52-4886-ae49-4fd7456b4cf4-kube-api-access-nfvqm\") pod \"redhat-marketplace-7qfrz\" (UID: \"d3920142-ac52-4886-ae49-4fd7456b4cf4\") " pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.372886 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3920142-ac52-4886-ae49-4fd7456b4cf4-catalog-content\") pod \"redhat-marketplace-7qfrz\" (UID: \"d3920142-ac52-4886-ae49-4fd7456b4cf4\") " pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.372944 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3920142-ac52-4886-ae49-4fd7456b4cf4-utilities\") pod \"redhat-marketplace-7qfrz\" (UID: \"d3920142-ac52-4886-ae49-4fd7456b4cf4\") " pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.394563 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfvqm\" (UniqueName: \"kubernetes.io/projected/d3920142-ac52-4886-ae49-4fd7456b4cf4-kube-api-access-nfvqm\") pod \"redhat-marketplace-7qfrz\" (UID: \"d3920142-ac52-4886-ae49-4fd7456b4cf4\") " pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.433276 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:11:52 crc kubenswrapper[4717]: I0221 22:11:52.911904 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qfrz"] Feb 21 22:11:53 crc kubenswrapper[4717]: I0221 22:11:53.038645 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-fc8f-account-create-update-vfrcv"] Feb 21 22:11:53 crc kubenswrapper[4717]: I0221 22:11:53.045725 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-fc8f-account-create-update-vfrcv"] Feb 21 22:11:53 crc kubenswrapper[4717]: I0221 22:11:53.637527 4717 generic.go:334] "Generic (PLEG): container finished" podID="d3920142-ac52-4886-ae49-4fd7456b4cf4" containerID="d487f6ade995127d27ec86517fa2499849d818f66d0b7f034c2f04d8b9285aba" exitCode=0 Feb 21 22:11:53 crc kubenswrapper[4717]: I0221 22:11:53.639418 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qfrz" event={"ID":"d3920142-ac52-4886-ae49-4fd7456b4cf4","Type":"ContainerDied","Data":"d487f6ade995127d27ec86517fa2499849d818f66d0b7f034c2f04d8b9285aba"} Feb 21 22:11:53 crc kubenswrapper[4717]: I0221 22:11:53.641002 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qfrz" event={"ID":"d3920142-ac52-4886-ae49-4fd7456b4cf4","Type":"ContainerStarted","Data":"4163abc3c0f1f89e641a5c871c3c566fa53fee8122eacdc04ec7f6c135c4d0f5"} Feb 21 22:11:53 crc kubenswrapper[4717]: I0221 22:11:53.998468 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="045ab09f-8170-4815-8af2-7511d0a29f3e" path="/var/lib/kubelet/pods/045ab09f-8170-4815-8af2-7511d0a29f3e/volumes" Feb 21 22:11:54 crc kubenswrapper[4717]: I0221 22:11:54.000129 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="362d581c-08cc-41a5-9327-209994947262" 
path="/var/lib/kubelet/pods/362d581c-08cc-41a5-9327-209994947262/volumes" Feb 21 22:11:54 crc kubenswrapper[4717]: I0221 22:11:54.651448 4717 generic.go:334] "Generic (PLEG): container finished" podID="d3920142-ac52-4886-ae49-4fd7456b4cf4" containerID="49f0f42880749b8ba913e68bfb3ce31c0ec9ac435a34334fde08da6a21b646ce" exitCode=0 Feb 21 22:11:54 crc kubenswrapper[4717]: I0221 22:11:54.651536 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qfrz" event={"ID":"d3920142-ac52-4886-ae49-4fd7456b4cf4","Type":"ContainerDied","Data":"49f0f42880749b8ba913e68bfb3ce31c0ec9ac435a34334fde08da6a21b646ce"} Feb 21 22:11:55 crc kubenswrapper[4717]: I0221 22:11:55.043239 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-5lppt"] Feb 21 22:11:55 crc kubenswrapper[4717]: I0221 22:11:55.053722 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-5lppt"] Feb 21 22:11:55 crc kubenswrapper[4717]: I0221 22:11:55.665819 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qfrz" event={"ID":"d3920142-ac52-4886-ae49-4fd7456b4cf4","Type":"ContainerStarted","Data":"baaf0db81bcc38af025d0da9b2785f8dbd3b33039aed02908f10bf9765039eb7"} Feb 21 22:11:55 crc kubenswrapper[4717]: I0221 22:11:55.691525 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7qfrz" podStartSLOduration=2.225644097 podStartE2EDuration="3.69150551s" podCreationTimestamp="2026-02-21 22:11:52 +0000 UTC" firstStartedPulling="2026-02-21 22:11:53.642282856 +0000 UTC m=+1528.423816508" lastFinishedPulling="2026-02-21 22:11:55.108144289 +0000 UTC m=+1529.889677921" observedRunningTime="2026-02-21 22:11:55.690290791 +0000 UTC m=+1530.471824443" watchObservedRunningTime="2026-02-21 22:11:55.69150551 +0000 UTC m=+1530.473039142" Feb 21 22:11:55 crc kubenswrapper[4717]: I0221 22:11:55.998004 4717 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c8b56e1-15ef-43f4-bc49-3dfe18978736" path="/var/lib/kubelet/pods/6c8b56e1-15ef-43f4-bc49-3dfe18978736/volumes" Feb 21 22:11:59 crc kubenswrapper[4717]: I0221 22:11:59.036371 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2w6q9"] Feb 21 22:11:59 crc kubenswrapper[4717]: I0221 22:11:59.047033 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9b8b-account-create-update-gf86p"] Feb 21 22:11:59 crc kubenswrapper[4717]: I0221 22:11:59.057946 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2w6q9"] Feb 21 22:11:59 crc kubenswrapper[4717]: I0221 22:11:59.068252 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9b8b-account-create-update-gf86p"] Feb 21 22:11:59 crc kubenswrapper[4717]: I0221 22:11:59.976274 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:11:59 crc kubenswrapper[4717]: E0221 22:11:59.976625 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:11:59 crc kubenswrapper[4717]: I0221 22:11:59.989230 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03fd9592-8cf7-4fda-a394-f0ad5efe2397" path="/var/lib/kubelet/pods/03fd9592-8cf7-4fda-a394-f0ad5efe2397/volumes" Feb 21 22:11:59 crc kubenswrapper[4717]: I0221 22:11:59.990565 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8357efe6-d264-4a56-902a-b7e443b93ac7" 
path="/var/lib/kubelet/pods/8357efe6-d264-4a56-902a-b7e443b93ac7/volumes" Feb 21 22:12:02 crc kubenswrapper[4717]: I0221 22:12:02.433820 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:12:02 crc kubenswrapper[4717]: I0221 22:12:02.435260 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:12:02 crc kubenswrapper[4717]: I0221 22:12:02.528116 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:12:02 crc kubenswrapper[4717]: I0221 22:12:02.842595 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:12:02 crc kubenswrapper[4717]: I0221 22:12:02.893411 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qfrz"] Feb 21 22:12:04 crc kubenswrapper[4717]: I0221 22:12:04.781252 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7qfrz" podUID="d3920142-ac52-4886-ae49-4fd7456b4cf4" containerName="registry-server" containerID="cri-o://baaf0db81bcc38af025d0da9b2785f8dbd3b33039aed02908f10bf9765039eb7" gracePeriod=2 Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.327761 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.450785 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3920142-ac52-4886-ae49-4fd7456b4cf4-catalog-content\") pod \"d3920142-ac52-4886-ae49-4fd7456b4cf4\" (UID: \"d3920142-ac52-4886-ae49-4fd7456b4cf4\") " Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.450932 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3920142-ac52-4886-ae49-4fd7456b4cf4-utilities\") pod \"d3920142-ac52-4886-ae49-4fd7456b4cf4\" (UID: \"d3920142-ac52-4886-ae49-4fd7456b4cf4\") " Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.451072 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfvqm\" (UniqueName: \"kubernetes.io/projected/d3920142-ac52-4886-ae49-4fd7456b4cf4-kube-api-access-nfvqm\") pod \"d3920142-ac52-4886-ae49-4fd7456b4cf4\" (UID: \"d3920142-ac52-4886-ae49-4fd7456b4cf4\") " Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.452308 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3920142-ac52-4886-ae49-4fd7456b4cf4-utilities" (OuterVolumeSpecName: "utilities") pod "d3920142-ac52-4886-ae49-4fd7456b4cf4" (UID: "d3920142-ac52-4886-ae49-4fd7456b4cf4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.456056 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3920142-ac52-4886-ae49-4fd7456b4cf4-kube-api-access-nfvqm" (OuterVolumeSpecName: "kube-api-access-nfvqm") pod "d3920142-ac52-4886-ae49-4fd7456b4cf4" (UID: "d3920142-ac52-4886-ae49-4fd7456b4cf4"). InnerVolumeSpecName "kube-api-access-nfvqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.492538 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3920142-ac52-4886-ae49-4fd7456b4cf4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3920142-ac52-4886-ae49-4fd7456b4cf4" (UID: "d3920142-ac52-4886-ae49-4fd7456b4cf4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.553160 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3920142-ac52-4886-ae49-4fd7456b4cf4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.553221 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3920142-ac52-4886-ae49-4fd7456b4cf4-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.553234 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfvqm\" (UniqueName: \"kubernetes.io/projected/d3920142-ac52-4886-ae49-4fd7456b4cf4-kube-api-access-nfvqm\") on node \"crc\" DevicePath \"\"" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.796013 4717 generic.go:334] "Generic (PLEG): container finished" podID="d3920142-ac52-4886-ae49-4fd7456b4cf4" containerID="baaf0db81bcc38af025d0da9b2785f8dbd3b33039aed02908f10bf9765039eb7" exitCode=0 Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.796068 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qfrz" event={"ID":"d3920142-ac52-4886-ae49-4fd7456b4cf4","Type":"ContainerDied","Data":"baaf0db81bcc38af025d0da9b2785f8dbd3b33039aed02908f10bf9765039eb7"} Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.796138 4717 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-7qfrz" event={"ID":"d3920142-ac52-4886-ae49-4fd7456b4cf4","Type":"ContainerDied","Data":"4163abc3c0f1f89e641a5c871c3c566fa53fee8122eacdc04ec7f6c135c4d0f5"} Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.796147 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qfrz" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.796177 4717 scope.go:117] "RemoveContainer" containerID="baaf0db81bcc38af025d0da9b2785f8dbd3b33039aed02908f10bf9765039eb7" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.845002 4717 scope.go:117] "RemoveContainer" containerID="49f0f42880749b8ba913e68bfb3ce31c0ec9ac435a34334fde08da6a21b646ce" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.856019 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qfrz"] Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.872507 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qfrz"] Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.880717 4717 scope.go:117] "RemoveContainer" containerID="d487f6ade995127d27ec86517fa2499849d818f66d0b7f034c2f04d8b9285aba" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.931336 4717 scope.go:117] "RemoveContainer" containerID="baaf0db81bcc38af025d0da9b2785f8dbd3b33039aed02908f10bf9765039eb7" Feb 21 22:12:05 crc kubenswrapper[4717]: E0221 22:12:05.932001 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baaf0db81bcc38af025d0da9b2785f8dbd3b33039aed02908f10bf9765039eb7\": container with ID starting with baaf0db81bcc38af025d0da9b2785f8dbd3b33039aed02908f10bf9765039eb7 not found: ID does not exist" containerID="baaf0db81bcc38af025d0da9b2785f8dbd3b33039aed02908f10bf9765039eb7" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.932070 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baaf0db81bcc38af025d0da9b2785f8dbd3b33039aed02908f10bf9765039eb7"} err="failed to get container status \"baaf0db81bcc38af025d0da9b2785f8dbd3b33039aed02908f10bf9765039eb7\": rpc error: code = NotFound desc = could not find container \"baaf0db81bcc38af025d0da9b2785f8dbd3b33039aed02908f10bf9765039eb7\": container with ID starting with baaf0db81bcc38af025d0da9b2785f8dbd3b33039aed02908f10bf9765039eb7 not found: ID does not exist" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.932112 4717 scope.go:117] "RemoveContainer" containerID="49f0f42880749b8ba913e68bfb3ce31c0ec9ac435a34334fde08da6a21b646ce" Feb 21 22:12:05 crc kubenswrapper[4717]: E0221 22:12:05.932485 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49f0f42880749b8ba913e68bfb3ce31c0ec9ac435a34334fde08da6a21b646ce\": container with ID starting with 49f0f42880749b8ba913e68bfb3ce31c0ec9ac435a34334fde08da6a21b646ce not found: ID does not exist" containerID="49f0f42880749b8ba913e68bfb3ce31c0ec9ac435a34334fde08da6a21b646ce" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.932525 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49f0f42880749b8ba913e68bfb3ce31c0ec9ac435a34334fde08da6a21b646ce"} err="failed to get container status \"49f0f42880749b8ba913e68bfb3ce31c0ec9ac435a34334fde08da6a21b646ce\": rpc error: code = NotFound desc = could not find container \"49f0f42880749b8ba913e68bfb3ce31c0ec9ac435a34334fde08da6a21b646ce\": container with ID starting with 49f0f42880749b8ba913e68bfb3ce31c0ec9ac435a34334fde08da6a21b646ce not found: ID does not exist" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.932551 4717 scope.go:117] "RemoveContainer" containerID="d487f6ade995127d27ec86517fa2499849d818f66d0b7f034c2f04d8b9285aba" Feb 21 22:12:05 crc kubenswrapper[4717]: E0221 
22:12:05.932968 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d487f6ade995127d27ec86517fa2499849d818f66d0b7f034c2f04d8b9285aba\": container with ID starting with d487f6ade995127d27ec86517fa2499849d818f66d0b7f034c2f04d8b9285aba not found: ID does not exist" containerID="d487f6ade995127d27ec86517fa2499849d818f66d0b7f034c2f04d8b9285aba" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.933062 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d487f6ade995127d27ec86517fa2499849d818f66d0b7f034c2f04d8b9285aba"} err="failed to get container status \"d487f6ade995127d27ec86517fa2499849d818f66d0b7f034c2f04d8b9285aba\": rpc error: code = NotFound desc = could not find container \"d487f6ade995127d27ec86517fa2499849d818f66d0b7f034c2f04d8b9285aba\": container with ID starting with d487f6ade995127d27ec86517fa2499849d818f66d0b7f034c2f04d8b9285aba not found: ID does not exist" Feb 21 22:12:05 crc kubenswrapper[4717]: I0221 22:12:05.998970 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3920142-ac52-4886-ae49-4fd7456b4cf4" path="/var/lib/kubelet/pods/d3920142-ac52-4886-ae49-4fd7456b4cf4/volumes" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.202068 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x88rg"] Feb 21 22:12:08 crc kubenswrapper[4717]: E0221 22:12:08.212030 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3920142-ac52-4886-ae49-4fd7456b4cf4" containerName="extract-content" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.212084 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3920142-ac52-4886-ae49-4fd7456b4cf4" containerName="extract-content" Feb 21 22:12:08 crc kubenswrapper[4717]: E0221 22:12:08.212175 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d3920142-ac52-4886-ae49-4fd7456b4cf4" containerName="extract-utilities" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.212183 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3920142-ac52-4886-ae49-4fd7456b4cf4" containerName="extract-utilities" Feb 21 22:12:08 crc kubenswrapper[4717]: E0221 22:12:08.212220 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3920142-ac52-4886-ae49-4fd7456b4cf4" containerName="registry-server" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.212245 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3920142-ac52-4886-ae49-4fd7456b4cf4" containerName="registry-server" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.213187 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3920142-ac52-4886-ae49-4fd7456b4cf4" containerName="registry-server" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.216067 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.230100 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x88rg"] Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.318579 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83ec667a-a90b-42b0-8ac6-c305282e7121-utilities\") pod \"certified-operators-x88rg\" (UID: \"83ec667a-a90b-42b0-8ac6-c305282e7121\") " pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.318820 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv4jf\" (UniqueName: \"kubernetes.io/projected/83ec667a-a90b-42b0-8ac6-c305282e7121-kube-api-access-lv4jf\") pod \"certified-operators-x88rg\" (UID: 
\"83ec667a-a90b-42b0-8ac6-c305282e7121\") " pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.318934 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83ec667a-a90b-42b0-8ac6-c305282e7121-catalog-content\") pod \"certified-operators-x88rg\" (UID: \"83ec667a-a90b-42b0-8ac6-c305282e7121\") " pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.420264 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83ec667a-a90b-42b0-8ac6-c305282e7121-catalog-content\") pod \"certified-operators-x88rg\" (UID: \"83ec667a-a90b-42b0-8ac6-c305282e7121\") " pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.420429 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83ec667a-a90b-42b0-8ac6-c305282e7121-utilities\") pod \"certified-operators-x88rg\" (UID: \"83ec667a-a90b-42b0-8ac6-c305282e7121\") " pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.420509 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv4jf\" (UniqueName: \"kubernetes.io/projected/83ec667a-a90b-42b0-8ac6-c305282e7121-kube-api-access-lv4jf\") pod \"certified-operators-x88rg\" (UID: \"83ec667a-a90b-42b0-8ac6-c305282e7121\") " pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.420849 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83ec667a-a90b-42b0-8ac6-c305282e7121-catalog-content\") pod \"certified-operators-x88rg\" (UID: 
\"83ec667a-a90b-42b0-8ac6-c305282e7121\") " pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.421023 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83ec667a-a90b-42b0-8ac6-c305282e7121-utilities\") pod \"certified-operators-x88rg\" (UID: \"83ec667a-a90b-42b0-8ac6-c305282e7121\") " pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.449975 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv4jf\" (UniqueName: \"kubernetes.io/projected/83ec667a-a90b-42b0-8ac6-c305282e7121-kube-api-access-lv4jf\") pod \"certified-operators-x88rg\" (UID: \"83ec667a-a90b-42b0-8ac6-c305282e7121\") " pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.548505 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:08 crc kubenswrapper[4717]: I0221 22:12:08.996952 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x88rg"] Feb 21 22:12:09 crc kubenswrapper[4717]: I0221 22:12:09.839798 4717 generic.go:334] "Generic (PLEG): container finished" podID="83ec667a-a90b-42b0-8ac6-c305282e7121" containerID="236e4b4adc1bd434cb0f9f3967b0823f8ef32287b3ca9be651d392d78bad3f01" exitCode=0 Feb 21 22:12:09 crc kubenswrapper[4717]: I0221 22:12:09.839857 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x88rg" event={"ID":"83ec667a-a90b-42b0-8ac6-c305282e7121","Type":"ContainerDied","Data":"236e4b4adc1bd434cb0f9f3967b0823f8ef32287b3ca9be651d392d78bad3f01"} Feb 21 22:12:09 crc kubenswrapper[4717]: I0221 22:12:09.839930 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x88rg" 
event={"ID":"83ec667a-a90b-42b0-8ac6-c305282e7121","Type":"ContainerStarted","Data":"5c694ef7a735de8eb9b250d0fd16504aa0223d940cd1530a9b615b0600ef4602"} Feb 21 22:12:09 crc kubenswrapper[4717]: I0221 22:12:09.842009 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 21 22:12:10 crc kubenswrapper[4717]: I0221 22:12:10.853706 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x88rg" event={"ID":"83ec667a-a90b-42b0-8ac6-c305282e7121","Type":"ContainerStarted","Data":"4fb1ec8bae2384bca5e621ac3b3b127c36eeda29c5e3745233246682b1ebddc3"} Feb 21 22:12:11 crc kubenswrapper[4717]: I0221 22:12:11.866845 4717 generic.go:334] "Generic (PLEG): container finished" podID="83ec667a-a90b-42b0-8ac6-c305282e7121" containerID="4fb1ec8bae2384bca5e621ac3b3b127c36eeda29c5e3745233246682b1ebddc3" exitCode=0 Feb 21 22:12:11 crc kubenswrapper[4717]: I0221 22:12:11.866935 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x88rg" event={"ID":"83ec667a-a90b-42b0-8ac6-c305282e7121","Type":"ContainerDied","Data":"4fb1ec8bae2384bca5e621ac3b3b127c36eeda29c5e3745233246682b1ebddc3"} Feb 21 22:12:11 crc kubenswrapper[4717]: I0221 22:12:11.980854 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:12:11 crc kubenswrapper[4717]: E0221 22:12:11.982099 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:12:12 crc kubenswrapper[4717]: I0221 22:12:12.881277 4717 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-x88rg" event={"ID":"83ec667a-a90b-42b0-8ac6-c305282e7121","Type":"ContainerStarted","Data":"11a817c636e125a0097fe838513020feae5d8558735cf98f7543f6f6a2b6166a"} Feb 21 22:12:12 crc kubenswrapper[4717]: I0221 22:12:12.909009 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x88rg" podStartSLOduration=2.2036568340000002 podStartE2EDuration="4.908980041s" podCreationTimestamp="2026-02-21 22:12:08 +0000 UTC" firstStartedPulling="2026-02-21 22:12:09.841698505 +0000 UTC m=+1544.623232137" lastFinishedPulling="2026-02-21 22:12:12.547021692 +0000 UTC m=+1547.328555344" observedRunningTime="2026-02-21 22:12:12.905306425 +0000 UTC m=+1547.686840077" watchObservedRunningTime="2026-02-21 22:12:12.908980041 +0000 UTC m=+1547.690513653" Feb 21 22:12:17 crc kubenswrapper[4717]: E0221 22:12:17.324835 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod674f8569_62c3_477e_85af_13befe292f49.slice/crio-conmon-fe0b02a495767c6ac5b28a5bcabf06854d67da633f61f9b641a0a609dde2ab20.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod674f8569_62c3_477e_85af_13befe292f49.slice/crio-fe0b02a495767c6ac5b28a5bcabf06854d67da633f61f9b641a0a609dde2ab20.scope\": RecentStats: unable to find data in memory cache]" Feb 21 22:12:17 crc kubenswrapper[4717]: I0221 22:12:17.975660 4717 generic.go:334] "Generic (PLEG): container finished" podID="674f8569-62c3-477e-85af-13befe292f49" containerID="fe0b02a495767c6ac5b28a5bcabf06854d67da633f61f9b641a0a609dde2ab20" exitCode=0 Feb 21 22:12:17 crc kubenswrapper[4717]: I0221 22:12:17.988680 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" 
event={"ID":"674f8569-62c3-477e-85af-13befe292f49","Type":"ContainerDied","Data":"fe0b02a495767c6ac5b28a5bcabf06854d67da633f61f9b641a0a609dde2ab20"} Feb 21 22:12:18 crc kubenswrapper[4717]: I0221 22:12:18.549723 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:18 crc kubenswrapper[4717]: I0221 22:12:18.550759 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:18 crc kubenswrapper[4717]: I0221 22:12:18.626756 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:19 crc kubenswrapper[4717]: I0221 22:12:19.089568 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:19 crc kubenswrapper[4717]: I0221 22:12:19.156071 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x88rg"] Feb 21 22:12:19 crc kubenswrapper[4717]: I0221 22:12:19.413707 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" Feb 21 22:12:19 crc kubenswrapper[4717]: I0221 22:12:19.553611 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/674f8569-62c3-477e-85af-13befe292f49-ssh-key-openstack-edpm-ipam\") pod \"674f8569-62c3-477e-85af-13befe292f49\" (UID: \"674f8569-62c3-477e-85af-13befe292f49\") " Feb 21 22:12:19 crc kubenswrapper[4717]: I0221 22:12:19.553695 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v4gn\" (UniqueName: \"kubernetes.io/projected/674f8569-62c3-477e-85af-13befe292f49-kube-api-access-7v4gn\") pod \"674f8569-62c3-477e-85af-13befe292f49\" (UID: \"674f8569-62c3-477e-85af-13befe292f49\") " Feb 21 22:12:19 crc kubenswrapper[4717]: I0221 22:12:19.553903 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/674f8569-62c3-477e-85af-13befe292f49-inventory\") pod \"674f8569-62c3-477e-85af-13befe292f49\" (UID: \"674f8569-62c3-477e-85af-13befe292f49\") " Feb 21 22:12:19 crc kubenswrapper[4717]: I0221 22:12:19.564010 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/674f8569-62c3-477e-85af-13befe292f49-kube-api-access-7v4gn" (OuterVolumeSpecName: "kube-api-access-7v4gn") pod "674f8569-62c3-477e-85af-13befe292f49" (UID: "674f8569-62c3-477e-85af-13befe292f49"). InnerVolumeSpecName "kube-api-access-7v4gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:12:19 crc kubenswrapper[4717]: I0221 22:12:19.582665 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/674f8569-62c3-477e-85af-13befe292f49-inventory" (OuterVolumeSpecName: "inventory") pod "674f8569-62c3-477e-85af-13befe292f49" (UID: "674f8569-62c3-477e-85af-13befe292f49"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:12:19 crc kubenswrapper[4717]: I0221 22:12:19.593009 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/674f8569-62c3-477e-85af-13befe292f49-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "674f8569-62c3-477e-85af-13befe292f49" (UID: "674f8569-62c3-477e-85af-13befe292f49"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:12:19 crc kubenswrapper[4717]: I0221 22:12:19.656205 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/674f8569-62c3-477e-85af-13befe292f49-inventory\") on node \"crc\" DevicePath \"\"" Feb 21 22:12:19 crc kubenswrapper[4717]: I0221 22:12:19.656247 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/674f8569-62c3-477e-85af-13befe292f49-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 21 22:12:19 crc kubenswrapper[4717]: I0221 22:12:19.656262 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v4gn\" (UniqueName: \"kubernetes.io/projected/674f8569-62c3-477e-85af-13befe292f49-kube-api-access-7v4gn\") on node \"crc\" DevicePath \"\"" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.000727 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" event={"ID":"674f8569-62c3-477e-85af-13befe292f49","Type":"ContainerDied","Data":"11fcee2f92110bf5912b1561f8a5c2367479c8c55326680bdf1f4c085cd9f4d7"} Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.000767 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11fcee2f92110bf5912b1561f8a5c2367479c8c55326680bdf1f4c085cd9f4d7" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 
22:12:20.000770 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.135242 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk"] Feb 21 22:12:20 crc kubenswrapper[4717]: E0221 22:12:20.136076 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674f8569-62c3-477e-85af-13befe292f49" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.136101 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="674f8569-62c3-477e-85af-13befe292f49" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.136335 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="674f8569-62c3-477e-85af-13befe292f49" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.137222 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.141327 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.141651 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.141824 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.141940 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.160044 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk"] Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.265771 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk\" (UID: \"79339bc9-6d8a-4fe5-ba8d-37643afe6d98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.265831 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk\" (UID: \"79339bc9-6d8a-4fe5-ba8d-37643afe6d98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" Feb 21 22:12:20 crc 
kubenswrapper[4717]: I0221 22:12:20.266040 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2znn\" (UniqueName: \"kubernetes.io/projected/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-kube-api-access-q2znn\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk\" (UID: \"79339bc9-6d8a-4fe5-ba8d-37643afe6d98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.368087 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk\" (UID: \"79339bc9-6d8a-4fe5-ba8d-37643afe6d98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.368197 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk\" (UID: \"79339bc9-6d8a-4fe5-ba8d-37643afe6d98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.368325 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2znn\" (UniqueName: \"kubernetes.io/projected/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-kube-api-access-q2znn\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk\" (UID: \"79339bc9-6d8a-4fe5-ba8d-37643afe6d98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.373076 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk\" (UID: \"79339bc9-6d8a-4fe5-ba8d-37643afe6d98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.384302 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk\" (UID: \"79339bc9-6d8a-4fe5-ba8d-37643afe6d98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.385845 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2znn\" (UniqueName: \"kubernetes.io/projected/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-kube-api-access-q2znn\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk\" (UID: \"79339bc9-6d8a-4fe5-ba8d-37643afe6d98\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" Feb 21 22:12:20 crc kubenswrapper[4717]: I0221 22:12:20.467794 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" Feb 21 22:12:21 crc kubenswrapper[4717]: I0221 22:12:21.009419 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x88rg" podUID="83ec667a-a90b-42b0-8ac6-c305282e7121" containerName="registry-server" containerID="cri-o://11a817c636e125a0097fe838513020feae5d8558735cf98f7543f6f6a2b6166a" gracePeriod=2 Feb 21 22:12:21 crc kubenswrapper[4717]: I0221 22:12:21.056397 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk"] Feb 21 22:12:21 crc kubenswrapper[4717]: I0221 22:12:21.465972 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:21 crc kubenswrapper[4717]: I0221 22:12:21.596877 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83ec667a-a90b-42b0-8ac6-c305282e7121-catalog-content\") pod \"83ec667a-a90b-42b0-8ac6-c305282e7121\" (UID: \"83ec667a-a90b-42b0-8ac6-c305282e7121\") " Feb 21 22:12:21 crc kubenswrapper[4717]: I0221 22:12:21.596931 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83ec667a-a90b-42b0-8ac6-c305282e7121-utilities\") pod \"83ec667a-a90b-42b0-8ac6-c305282e7121\" (UID: \"83ec667a-a90b-42b0-8ac6-c305282e7121\") " Feb 21 22:12:21 crc kubenswrapper[4717]: I0221 22:12:21.597100 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv4jf\" (UniqueName: \"kubernetes.io/projected/83ec667a-a90b-42b0-8ac6-c305282e7121-kube-api-access-lv4jf\") pod \"83ec667a-a90b-42b0-8ac6-c305282e7121\" (UID: \"83ec667a-a90b-42b0-8ac6-c305282e7121\") " Feb 21 22:12:21 crc kubenswrapper[4717]: I0221 22:12:21.597725 
4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83ec667a-a90b-42b0-8ac6-c305282e7121-utilities" (OuterVolumeSpecName: "utilities") pod "83ec667a-a90b-42b0-8ac6-c305282e7121" (UID: "83ec667a-a90b-42b0-8ac6-c305282e7121"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:12:21 crc kubenswrapper[4717]: I0221 22:12:21.604441 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83ec667a-a90b-42b0-8ac6-c305282e7121-kube-api-access-lv4jf" (OuterVolumeSpecName: "kube-api-access-lv4jf") pod "83ec667a-a90b-42b0-8ac6-c305282e7121" (UID: "83ec667a-a90b-42b0-8ac6-c305282e7121"). InnerVolumeSpecName "kube-api-access-lv4jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:12:21 crc kubenswrapper[4717]: I0221 22:12:21.673527 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83ec667a-a90b-42b0-8ac6-c305282e7121-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83ec667a-a90b-42b0-8ac6-c305282e7121" (UID: "83ec667a-a90b-42b0-8ac6-c305282e7121"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:12:21 crc kubenswrapper[4717]: I0221 22:12:21.699724 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83ec667a-a90b-42b0-8ac6-c305282e7121-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:12:21 crc kubenswrapper[4717]: I0221 22:12:21.699757 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83ec667a-a90b-42b0-8ac6-c305282e7121-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:12:21 crc kubenswrapper[4717]: I0221 22:12:21.699767 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv4jf\" (UniqueName: \"kubernetes.io/projected/83ec667a-a90b-42b0-8ac6-c305282e7121-kube-api-access-lv4jf\") on node \"crc\" DevicePath \"\"" Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.023487 4717 generic.go:334] "Generic (PLEG): container finished" podID="83ec667a-a90b-42b0-8ac6-c305282e7121" containerID="11a817c636e125a0097fe838513020feae5d8558735cf98f7543f6f6a2b6166a" exitCode=0 Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.023570 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x88rg" Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.023573 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x88rg" event={"ID":"83ec667a-a90b-42b0-8ac6-c305282e7121","Type":"ContainerDied","Data":"11a817c636e125a0097fe838513020feae5d8558735cf98f7543f6f6a2b6166a"} Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.023756 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x88rg" event={"ID":"83ec667a-a90b-42b0-8ac6-c305282e7121","Type":"ContainerDied","Data":"5c694ef7a735de8eb9b250d0fd16504aa0223d940cd1530a9b615b0600ef4602"} Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.023796 4717 scope.go:117] "RemoveContainer" containerID="11a817c636e125a0097fe838513020feae5d8558735cf98f7543f6f6a2b6166a" Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.026760 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" event={"ID":"79339bc9-6d8a-4fe5-ba8d-37643afe6d98","Type":"ContainerStarted","Data":"a2a3067e1813f770d9054c4a587b2462798f6bd99f7bfeb58b708ca0ac23e7b3"} Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.026800 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" event={"ID":"79339bc9-6d8a-4fe5-ba8d-37643afe6d98","Type":"ContainerStarted","Data":"deeb190849fa3d73040e1be4616c1cfb18b1e0a1aa4e0446b4d6bc9011a7342e"} Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.045748 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-g8gpn"] Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.062821 4717 scope.go:117] "RemoveContainer" containerID="4fb1ec8bae2384bca5e621ac3b3b127c36eeda29c5e3745233246682b1ebddc3" Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.067872 4717 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-vwk86"] Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.087612 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-g8gpn"] Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.094680 4717 scope.go:117] "RemoveContainer" containerID="236e4b4adc1bd434cb0f9f3967b0823f8ef32287b3ca9be651d392d78bad3f01" Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.096090 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-vwk86"] Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.100081 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" podStartSLOduration=1.41598596 podStartE2EDuration="2.10004403s" podCreationTimestamp="2026-02-21 22:12:20 +0000 UTC" firstStartedPulling="2026-02-21 22:12:21.06130041 +0000 UTC m=+1555.842834072" lastFinishedPulling="2026-02-21 22:12:21.74535852 +0000 UTC m=+1556.526892142" observedRunningTime="2026-02-21 22:12:22.054438336 +0000 UTC m=+1556.835971958" watchObservedRunningTime="2026-02-21 22:12:22.10004403 +0000 UTC m=+1556.881577652" Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.118779 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x88rg"] Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.121889 4717 scope.go:117] "RemoveContainer" containerID="11a817c636e125a0097fe838513020feae5d8558735cf98f7543f6f6a2b6166a" Feb 21 22:12:22 crc kubenswrapper[4717]: E0221 22:12:22.122610 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11a817c636e125a0097fe838513020feae5d8558735cf98f7543f6f6a2b6166a\": container with ID starting with 11a817c636e125a0097fe838513020feae5d8558735cf98f7543f6f6a2b6166a not found: ID does not exist" 
containerID="11a817c636e125a0097fe838513020feae5d8558735cf98f7543f6f6a2b6166a" Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.122683 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11a817c636e125a0097fe838513020feae5d8558735cf98f7543f6f6a2b6166a"} err="failed to get container status \"11a817c636e125a0097fe838513020feae5d8558735cf98f7543f6f6a2b6166a\": rpc error: code = NotFound desc = could not find container \"11a817c636e125a0097fe838513020feae5d8558735cf98f7543f6f6a2b6166a\": container with ID starting with 11a817c636e125a0097fe838513020feae5d8558735cf98f7543f6f6a2b6166a not found: ID does not exist" Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.122728 4717 scope.go:117] "RemoveContainer" containerID="4fb1ec8bae2384bca5e621ac3b3b127c36eeda29c5e3745233246682b1ebddc3" Feb 21 22:12:22 crc kubenswrapper[4717]: E0221 22:12:22.123228 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fb1ec8bae2384bca5e621ac3b3b127c36eeda29c5e3745233246682b1ebddc3\": container with ID starting with 4fb1ec8bae2384bca5e621ac3b3b127c36eeda29c5e3745233246682b1ebddc3 not found: ID does not exist" containerID="4fb1ec8bae2384bca5e621ac3b3b127c36eeda29c5e3745233246682b1ebddc3" Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.123282 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb1ec8bae2384bca5e621ac3b3b127c36eeda29c5e3745233246682b1ebddc3"} err="failed to get container status \"4fb1ec8bae2384bca5e621ac3b3b127c36eeda29c5e3745233246682b1ebddc3\": rpc error: code = NotFound desc = could not find container \"4fb1ec8bae2384bca5e621ac3b3b127c36eeda29c5e3745233246682b1ebddc3\": container with ID starting with 4fb1ec8bae2384bca5e621ac3b3b127c36eeda29c5e3745233246682b1ebddc3 not found: ID does not exist" Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.123315 4717 scope.go:117] 
"RemoveContainer" containerID="236e4b4adc1bd434cb0f9f3967b0823f8ef32287b3ca9be651d392d78bad3f01" Feb 21 22:12:22 crc kubenswrapper[4717]: E0221 22:12:22.123968 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"236e4b4adc1bd434cb0f9f3967b0823f8ef32287b3ca9be651d392d78bad3f01\": container with ID starting with 236e4b4adc1bd434cb0f9f3967b0823f8ef32287b3ca9be651d392d78bad3f01 not found: ID does not exist" containerID="236e4b4adc1bd434cb0f9f3967b0823f8ef32287b3ca9be651d392d78bad3f01" Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.124001 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236e4b4adc1bd434cb0f9f3967b0823f8ef32287b3ca9be651d392d78bad3f01"} err="failed to get container status \"236e4b4adc1bd434cb0f9f3967b0823f8ef32287b3ca9be651d392d78bad3f01\": rpc error: code = NotFound desc = could not find container \"236e4b4adc1bd434cb0f9f3967b0823f8ef32287b3ca9be651d392d78bad3f01\": container with ID starting with 236e4b4adc1bd434cb0f9f3967b0823f8ef32287b3ca9be651d392d78bad3f01 not found: ID does not exist" Feb 21 22:12:22 crc kubenswrapper[4717]: I0221 22:12:22.128165 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x88rg"] Feb 21 22:12:24 crc kubenswrapper[4717]: I0221 22:12:24.001066 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83ec667a-a90b-42b0-8ac6-c305282e7121" path="/var/lib/kubelet/pods/83ec667a-a90b-42b0-8ac6-c305282e7121/volumes" Feb 21 22:12:24 crc kubenswrapper[4717]: I0221 22:12:24.003121 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1e4b601-cb61-433b-8e06-cbd920071fc5" path="/var/lib/kubelet/pods/b1e4b601-cb61-433b-8e06-cbd920071fc5/volumes" Feb 21 22:12:24 crc kubenswrapper[4717]: I0221 22:12:24.004712 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ff13878c-357b-4a39-8b6b-f4f6e1929fed" path="/var/lib/kubelet/pods/ff13878c-357b-4a39-8b6b-f4f6e1929fed/volumes" Feb 21 22:12:25 crc kubenswrapper[4717]: I0221 22:12:25.050021 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e2b5-account-create-update-bq26b"] Feb 21 22:12:25 crc kubenswrapper[4717]: I0221 22:12:25.075620 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d9ae-account-create-update-69mhh"] Feb 21 22:12:25 crc kubenswrapper[4717]: I0221 22:12:25.091345 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d9ae-account-create-update-69mhh"] Feb 21 22:12:25 crc kubenswrapper[4717]: I0221 22:12:25.105478 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b660-account-create-update-5brnn"] Feb 21 22:12:25 crc kubenswrapper[4717]: I0221 22:12:25.117799 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e2b5-account-create-update-bq26b"] Feb 21 22:12:25 crc kubenswrapper[4717]: I0221 22:12:25.127449 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-hmzb4"] Feb 21 22:12:25 crc kubenswrapper[4717]: I0221 22:12:25.139369 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b660-account-create-update-5brnn"] Feb 21 22:12:25 crc kubenswrapper[4717]: I0221 22:12:25.151547 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hmzb4"] Feb 21 22:12:25 crc kubenswrapper[4717]: I0221 22:12:25.988484 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:12:25 crc kubenswrapper[4717]: E0221 22:12:25.988824 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:12:26 crc kubenswrapper[4717]: I0221 22:12:26.000766 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="040a8750-e237-47d5-8b7b-8f310c436b87" path="/var/lib/kubelet/pods/040a8750-e237-47d5-8b7b-8f310c436b87/volumes" Feb 21 22:12:26 crc kubenswrapper[4717]: I0221 22:12:26.003033 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86a464ef-0333-4c29-a6a7-8af81a592e0b" path="/var/lib/kubelet/pods/86a464ef-0333-4c29-a6a7-8af81a592e0b/volumes" Feb 21 22:12:26 crc kubenswrapper[4717]: I0221 22:12:26.003617 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7dda56c-0b65-46d4-9f88-dda7c73423da" path="/var/lib/kubelet/pods/b7dda56c-0b65-46d4-9f88-dda7c73423da/volumes" Feb 21 22:12:26 crc kubenswrapper[4717]: I0221 22:12:26.004152 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd80c486-9397-43f2-ba5e-c6f868a2a47a" path="/var/lib/kubelet/pods/dd80c486-9397-43f2-ba5e-c6f868a2a47a/volumes" Feb 21 22:12:28 crc kubenswrapper[4717]: I0221 22:12:28.261894 4717 scope.go:117] "RemoveContainer" containerID="8405cf1ce90dabded9e1452bc6e06c60e776fad26ceb6f912e58175ac3b1c571" Feb 21 22:12:28 crc kubenswrapper[4717]: I0221 22:12:28.303161 4717 scope.go:117] "RemoveContainer" containerID="178405d965dfcf5114b5e461d8ca83be0a88edcc7fde6c38357ea3a03e042d43" Feb 21 22:12:28 crc kubenswrapper[4717]: I0221 22:12:28.355653 4717 scope.go:117] "RemoveContainer" containerID="dd54f39fffd116541fe63ed6f01871dda4f9451975d8377afbd79a1fd91e372c" Feb 21 22:12:28 crc kubenswrapper[4717]: I0221 22:12:28.403288 4717 scope.go:117] "RemoveContainer" containerID="e6c2a0752dc5328f8b1331059bc4c3578c5cc53cd20854e0fc6cf06f7e5ac335" Feb 21 22:12:28 crc kubenswrapper[4717]: I0221 22:12:28.458540 
4717 scope.go:117] "RemoveContainer" containerID="d030300c9eaa4bb2ebe31023b3dcacc5c3e7cd4b2f3bb3489184bc74fcda5bb6" Feb 21 22:12:28 crc kubenswrapper[4717]: I0221 22:12:28.487440 4717 scope.go:117] "RemoveContainer" containerID="75cf48369ea01327d45e0794cbeacde6407dba313b0557a0f9009b1ce4f0b902" Feb 21 22:12:28 crc kubenswrapper[4717]: I0221 22:12:28.527848 4717 scope.go:117] "RemoveContainer" containerID="452e5f9e478d34e1dd34d9506c5322fba26c49d545d7c820c5e7dc01b782c588" Feb 21 22:12:28 crc kubenswrapper[4717]: I0221 22:12:28.561845 4717 scope.go:117] "RemoveContainer" containerID="417b05e1b9da0cce1fb21ba9078fac3a79e4d41ca454e02a59213ad77b3cb3c1" Feb 21 22:12:28 crc kubenswrapper[4717]: I0221 22:12:28.583321 4717 scope.go:117] "RemoveContainer" containerID="fc7cd4ee13db69b9236e1f63d473984ae556a720a7e31a844c32a4ad4f852bf8" Feb 21 22:12:28 crc kubenswrapper[4717]: I0221 22:12:28.603050 4717 scope.go:117] "RemoveContainer" containerID="66f7b17be06f663f8dd046236b2bbd86a3a97c9fef06057e31fbd5121aa231a7" Feb 21 22:12:28 crc kubenswrapper[4717]: I0221 22:12:28.632113 4717 scope.go:117] "RemoveContainer" containerID="b01bbd660892dea72dca718004e62bd76fcfb0330af74253832932721da9a9ea" Feb 21 22:12:28 crc kubenswrapper[4717]: I0221 22:12:28.668583 4717 scope.go:117] "RemoveContainer" containerID="6a7d6872e80a0e2c9b9165696c47af427bc8a25cec42384f6ae03706d9ae6e32" Feb 21 22:12:28 crc kubenswrapper[4717]: I0221 22:12:28.711785 4717 scope.go:117] "RemoveContainer" containerID="03287299a3fb78fd55b94a2243cf8487ef5a67bb62539551a3c8485918e6ca3e" Feb 21 22:12:29 crc kubenswrapper[4717]: I0221 22:12:29.048001 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-7m8mk"] Feb 21 22:12:29 crc kubenswrapper[4717]: I0221 22:12:29.065717 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-7m8mk"] Feb 21 22:12:29 crc kubenswrapper[4717]: I0221 22:12:29.896231 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-4g88v"] Feb 21 22:12:29 crc kubenswrapper[4717]: E0221 22:12:29.897543 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ec667a-a90b-42b0-8ac6-c305282e7121" containerName="extract-utilities" Feb 21 22:12:29 crc kubenswrapper[4717]: I0221 22:12:29.897659 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ec667a-a90b-42b0-8ac6-c305282e7121" containerName="extract-utilities" Feb 21 22:12:29 crc kubenswrapper[4717]: E0221 22:12:29.897748 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ec667a-a90b-42b0-8ac6-c305282e7121" containerName="registry-server" Feb 21 22:12:29 crc kubenswrapper[4717]: I0221 22:12:29.897846 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ec667a-a90b-42b0-8ac6-c305282e7121" containerName="registry-server" Feb 21 22:12:29 crc kubenswrapper[4717]: E0221 22:12:29.897970 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ec667a-a90b-42b0-8ac6-c305282e7121" containerName="extract-content" Feb 21 22:12:29 crc kubenswrapper[4717]: I0221 22:12:29.898048 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ec667a-a90b-42b0-8ac6-c305282e7121" containerName="extract-content" Feb 21 22:12:29 crc kubenswrapper[4717]: I0221 22:12:29.898337 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ec667a-a90b-42b0-8ac6-c305282e7121" containerName="registry-server" Feb 21 22:12:29 crc kubenswrapper[4717]: I0221 22:12:29.900171 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:29 crc kubenswrapper[4717]: I0221 22:12:29.916353 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4g88v"] Feb 21 22:12:29 crc kubenswrapper[4717]: I0221 22:12:29.987666 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99b03743-48fc-4006-8cfd-b912deba0232" path="/var/lib/kubelet/pods/99b03743-48fc-4006-8cfd-b912deba0232/volumes" Feb 21 22:12:30 crc kubenswrapper[4717]: I0221 22:12:30.085435 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-utilities\") pod \"redhat-operators-4g88v\" (UID: \"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc\") " pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:30 crc kubenswrapper[4717]: I0221 22:12:30.085748 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-catalog-content\") pod \"redhat-operators-4g88v\" (UID: \"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc\") " pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:30 crc kubenswrapper[4717]: I0221 22:12:30.085800 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtd76\" (UniqueName: \"kubernetes.io/projected/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-kube-api-access-wtd76\") pod \"redhat-operators-4g88v\" (UID: \"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc\") " pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:30 crc kubenswrapper[4717]: I0221 22:12:30.187180 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-catalog-content\") pod 
\"redhat-operators-4g88v\" (UID: \"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc\") " pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:30 crc kubenswrapper[4717]: I0221 22:12:30.187239 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtd76\" (UniqueName: \"kubernetes.io/projected/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-kube-api-access-wtd76\") pod \"redhat-operators-4g88v\" (UID: \"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc\") " pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:30 crc kubenswrapper[4717]: I0221 22:12:30.187275 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-utilities\") pod \"redhat-operators-4g88v\" (UID: \"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc\") " pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:30 crc kubenswrapper[4717]: I0221 22:12:30.187773 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-utilities\") pod \"redhat-operators-4g88v\" (UID: \"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc\") " pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:30 crc kubenswrapper[4717]: I0221 22:12:30.187777 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-catalog-content\") pod \"redhat-operators-4g88v\" (UID: \"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc\") " pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:30 crc kubenswrapper[4717]: I0221 22:12:30.213923 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtd76\" (UniqueName: \"kubernetes.io/projected/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-kube-api-access-wtd76\") pod \"redhat-operators-4g88v\" (UID: 
\"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc\") " pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:30 crc kubenswrapper[4717]: I0221 22:12:30.232646 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:30 crc kubenswrapper[4717]: I0221 22:12:30.729285 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4g88v"] Feb 21 22:12:31 crc kubenswrapper[4717]: I0221 22:12:31.139497 4717 generic.go:334] "Generic (PLEG): container finished" podID="c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" containerID="d74202fd15c40407741438afdf09514965fbd24e7dfbadbe3d3f8f289f9d595d" exitCode=0 Feb 21 22:12:31 crc kubenswrapper[4717]: I0221 22:12:31.139535 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4g88v" event={"ID":"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc","Type":"ContainerDied","Data":"d74202fd15c40407741438afdf09514965fbd24e7dfbadbe3d3f8f289f9d595d"} Feb 21 22:12:31 crc kubenswrapper[4717]: I0221 22:12:31.139558 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4g88v" event={"ID":"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc","Type":"ContainerStarted","Data":"5f6bdd14da651f25c4730ec9509021dbe77d3ab69d88fddafa0ca30e67a63fe9"} Feb 21 22:12:33 crc kubenswrapper[4717]: I0221 22:12:33.161038 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4g88v" event={"ID":"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc","Type":"ContainerStarted","Data":"53a908c67435a0b53c7a53b4d2561789c0370ef949e7629d471367e846911f9f"} Feb 21 22:12:34 crc kubenswrapper[4717]: I0221 22:12:34.172897 4717 generic.go:334] "Generic (PLEG): container finished" podID="c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" containerID="53a908c67435a0b53c7a53b4d2561789c0370ef949e7629d471367e846911f9f" exitCode=0 Feb 21 22:12:34 crc kubenswrapper[4717]: I0221 22:12:34.173031 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4g88v" event={"ID":"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc","Type":"ContainerDied","Data":"53a908c67435a0b53c7a53b4d2561789c0370ef949e7629d471367e846911f9f"} Feb 21 22:12:35 crc kubenswrapper[4717]: I0221 22:12:35.184105 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4g88v" event={"ID":"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc","Type":"ContainerStarted","Data":"dc9eaa590662f0dc8c585d87ed233000361dd988d3d6fe435664a276a6dc7b5e"} Feb 21 22:12:35 crc kubenswrapper[4717]: I0221 22:12:35.212203 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4g88v" podStartSLOduration=2.781780127 podStartE2EDuration="6.212184522s" podCreationTimestamp="2026-02-21 22:12:29 +0000 UTC" firstStartedPulling="2026-02-21 22:12:31.141330052 +0000 UTC m=+1565.922863674" lastFinishedPulling="2026-02-21 22:12:34.571734407 +0000 UTC m=+1569.353268069" observedRunningTime="2026-02-21 22:12:35.206391025 +0000 UTC m=+1569.987924647" watchObservedRunningTime="2026-02-21 22:12:35.212184522 +0000 UTC m=+1569.993718144" Feb 21 22:12:38 crc kubenswrapper[4717]: I0221 22:12:38.977513 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:12:38 crc kubenswrapper[4717]: E0221 22:12:38.978174 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:12:40 crc kubenswrapper[4717]: I0221 22:12:40.233182 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:40 crc kubenswrapper[4717]: I0221 22:12:40.233223 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:41 crc kubenswrapper[4717]: I0221 22:12:41.297202 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4g88v" podUID="c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" containerName="registry-server" probeResult="failure" output=< Feb 21 22:12:41 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Feb 21 22:12:41 crc kubenswrapper[4717]: > Feb 21 22:12:50 crc kubenswrapper[4717]: I0221 22:12:50.327565 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:50 crc kubenswrapper[4717]: I0221 22:12:50.388084 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:50 crc kubenswrapper[4717]: I0221 22:12:50.572248 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4g88v"] Feb 21 22:12:52 crc kubenswrapper[4717]: I0221 22:12:52.347099 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4g88v" podUID="c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" containerName="registry-server" containerID="cri-o://dc9eaa590662f0dc8c585d87ed233000361dd988d3d6fe435664a276a6dc7b5e" gracePeriod=2 Feb 21 22:12:52 crc kubenswrapper[4717]: I0221 22:12:52.909135 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:52 crc kubenswrapper[4717]: I0221 22:12:52.976954 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:12:52 crc kubenswrapper[4717]: E0221 22:12:52.977295 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.051981 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtd76\" (UniqueName: \"kubernetes.io/projected/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-kube-api-access-wtd76\") pod \"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc\" (UID: \"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc\") " Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.052053 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-catalog-content\") pod \"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc\" (UID: \"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc\") " Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.052095 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-utilities\") pod \"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc\" (UID: \"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc\") " Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.053417 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-utilities" (OuterVolumeSpecName: "utilities") pod "c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" (UID: "c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.065551 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-kube-api-access-wtd76" (OuterVolumeSpecName: "kube-api-access-wtd76") pod "c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" (UID: "c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc"). InnerVolumeSpecName "kube-api-access-wtd76". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.154208 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtd76\" (UniqueName: \"kubernetes.io/projected/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-kube-api-access-wtd76\") on node \"crc\" DevicePath \"\"" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.154238 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.190575 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" (UID: "c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.255877 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.364447 4717 generic.go:334] "Generic (PLEG): container finished" podID="c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" containerID="dc9eaa590662f0dc8c585d87ed233000361dd988d3d6fe435664a276a6dc7b5e" exitCode=0 Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.364528 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4g88v" event={"ID":"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc","Type":"ContainerDied","Data":"dc9eaa590662f0dc8c585d87ed233000361dd988d3d6fe435664a276a6dc7b5e"} Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.364571 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4g88v" event={"ID":"c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc","Type":"ContainerDied","Data":"5f6bdd14da651f25c4730ec9509021dbe77d3ab69d88fddafa0ca30e67a63fe9"} Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.364602 4717 scope.go:117] "RemoveContainer" containerID="dc9eaa590662f0dc8c585d87ed233000361dd988d3d6fe435664a276a6dc7b5e" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.364812 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4g88v" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.414927 4717 scope.go:117] "RemoveContainer" containerID="53a908c67435a0b53c7a53b4d2561789c0370ef949e7629d471367e846911f9f" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.439420 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4g88v"] Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.448056 4717 scope.go:117] "RemoveContainer" containerID="d74202fd15c40407741438afdf09514965fbd24e7dfbadbe3d3f8f289f9d595d" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.455414 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4g88v"] Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.544514 4717 scope.go:117] "RemoveContainer" containerID="dc9eaa590662f0dc8c585d87ed233000361dd988d3d6fe435664a276a6dc7b5e" Feb 21 22:12:53 crc kubenswrapper[4717]: E0221 22:12:53.545032 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc9eaa590662f0dc8c585d87ed233000361dd988d3d6fe435664a276a6dc7b5e\": container with ID starting with dc9eaa590662f0dc8c585d87ed233000361dd988d3d6fe435664a276a6dc7b5e not found: ID does not exist" containerID="dc9eaa590662f0dc8c585d87ed233000361dd988d3d6fe435664a276a6dc7b5e" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.545082 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9eaa590662f0dc8c585d87ed233000361dd988d3d6fe435664a276a6dc7b5e"} err="failed to get container status \"dc9eaa590662f0dc8c585d87ed233000361dd988d3d6fe435664a276a6dc7b5e\": rpc error: code = NotFound desc = could not find container \"dc9eaa590662f0dc8c585d87ed233000361dd988d3d6fe435664a276a6dc7b5e\": container with ID starting with dc9eaa590662f0dc8c585d87ed233000361dd988d3d6fe435664a276a6dc7b5e not found: ID does 
not exist" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.545120 4717 scope.go:117] "RemoveContainer" containerID="53a908c67435a0b53c7a53b4d2561789c0370ef949e7629d471367e846911f9f" Feb 21 22:12:53 crc kubenswrapper[4717]: E0221 22:12:53.545423 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a908c67435a0b53c7a53b4d2561789c0370ef949e7629d471367e846911f9f\": container with ID starting with 53a908c67435a0b53c7a53b4d2561789c0370ef949e7629d471367e846911f9f not found: ID does not exist" containerID="53a908c67435a0b53c7a53b4d2561789c0370ef949e7629d471367e846911f9f" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.545462 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a908c67435a0b53c7a53b4d2561789c0370ef949e7629d471367e846911f9f"} err="failed to get container status \"53a908c67435a0b53c7a53b4d2561789c0370ef949e7629d471367e846911f9f\": rpc error: code = NotFound desc = could not find container \"53a908c67435a0b53c7a53b4d2561789c0370ef949e7629d471367e846911f9f\": container with ID starting with 53a908c67435a0b53c7a53b4d2561789c0370ef949e7629d471367e846911f9f not found: ID does not exist" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.545488 4717 scope.go:117] "RemoveContainer" containerID="d74202fd15c40407741438afdf09514965fbd24e7dfbadbe3d3f8f289f9d595d" Feb 21 22:12:53 crc kubenswrapper[4717]: E0221 22:12:53.545887 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d74202fd15c40407741438afdf09514965fbd24e7dfbadbe3d3f8f289f9d595d\": container with ID starting with d74202fd15c40407741438afdf09514965fbd24e7dfbadbe3d3f8f289f9d595d not found: ID does not exist" containerID="d74202fd15c40407741438afdf09514965fbd24e7dfbadbe3d3f8f289f9d595d" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.545925 4717 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74202fd15c40407741438afdf09514965fbd24e7dfbadbe3d3f8f289f9d595d"} err="failed to get container status \"d74202fd15c40407741438afdf09514965fbd24e7dfbadbe3d3f8f289f9d595d\": rpc error: code = NotFound desc = could not find container \"d74202fd15c40407741438afdf09514965fbd24e7dfbadbe3d3f8f289f9d595d\": container with ID starting with d74202fd15c40407741438afdf09514965fbd24e7dfbadbe3d3f8f289f9d595d not found: ID does not exist" Feb 21 22:12:53 crc kubenswrapper[4717]: I0221 22:12:53.986147 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" path="/var/lib/kubelet/pods/c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc/volumes" Feb 21 22:12:58 crc kubenswrapper[4717]: I0221 22:12:58.043698 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-v94db"] Feb 21 22:12:58 crc kubenswrapper[4717]: I0221 22:12:58.051273 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-4ttdv"] Feb 21 22:12:58 crc kubenswrapper[4717]: I0221 22:12:58.059266 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-v94db"] Feb 21 22:12:58 crc kubenswrapper[4717]: I0221 22:12:58.068329 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-4ttdv"] Feb 21 22:12:59 crc kubenswrapper[4717]: I0221 22:12:59.990392 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7995c515-84a0-44d3-82e8-99a2ab1fb7b2" path="/var/lib/kubelet/pods/7995c515-84a0-44d3-82e8-99a2ab1fb7b2/volumes" Feb 21 22:12:59 crc kubenswrapper[4717]: I0221 22:12:59.991541 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcb19e52-5b9c-478e-9e86-cc5529c2a6d7" path="/var/lib/kubelet/pods/fcb19e52-5b9c-478e-9e86-cc5529c2a6d7/volumes" Feb 21 22:13:03 crc kubenswrapper[4717]: I0221 22:13:03.976983 4717 scope.go:117] "RemoveContainer" 
containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:13:03 crc kubenswrapper[4717]: E0221 22:13:03.977820 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:13:04 crc kubenswrapper[4717]: I0221 22:13:04.048342 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zsxs4"] Feb 21 22:13:04 crc kubenswrapper[4717]: I0221 22:13:04.059430 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zsxs4"] Feb 21 22:13:05 crc kubenswrapper[4717]: I0221 22:13:05.988303 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d38c89d0-4315-4d98-86bc-570662736bba" path="/var/lib/kubelet/pods/d38c89d0-4315-4d98-86bc-570662736bba/volumes" Feb 21 22:13:11 crc kubenswrapper[4717]: I0221 22:13:11.061263 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rrpvp"] Feb 21 22:13:11 crc kubenswrapper[4717]: I0221 22:13:11.078566 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rrpvp"] Feb 21 22:13:11 crc kubenswrapper[4717]: I0221 22:13:11.998972 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a647fe81-8c83-4f0c-996b-1a71081700f0" path="/var/lib/kubelet/pods/a647fe81-8c83-4f0c-996b-1a71081700f0/volumes" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.384016 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8ssqb"] Feb 21 22:13:14 crc kubenswrapper[4717]: E0221 22:13:14.387602 4717 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" containerName="registry-server" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.387638 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" containerName="registry-server" Feb 21 22:13:14 crc kubenswrapper[4717]: E0221 22:13:14.387656 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" containerName="extract-utilities" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.387665 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" containerName="extract-utilities" Feb 21 22:13:14 crc kubenswrapper[4717]: E0221 22:13:14.387725 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" containerName="extract-content" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.387734 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" containerName="extract-content" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.388057 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99f7701-7cf7-44fe-aabc-36ea7ae3b9fc" containerName="registry-server" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.389745 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8ssqb" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.407486 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ssqb"] Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.523980 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-catalog-content\") pod \"community-operators-8ssqb\" (UID: \"65cb61db-ef1b-4173-9fa6-a79ebdc64d43\") " pod="openshift-marketplace/community-operators-8ssqb" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.524079 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vswbh\" (UniqueName: \"kubernetes.io/projected/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-kube-api-access-vswbh\") pod \"community-operators-8ssqb\" (UID: \"65cb61db-ef1b-4173-9fa6-a79ebdc64d43\") " pod="openshift-marketplace/community-operators-8ssqb" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.524208 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-utilities\") pod \"community-operators-8ssqb\" (UID: \"65cb61db-ef1b-4173-9fa6-a79ebdc64d43\") " pod="openshift-marketplace/community-operators-8ssqb" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.626031 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-utilities\") pod \"community-operators-8ssqb\" (UID: \"65cb61db-ef1b-4173-9fa6-a79ebdc64d43\") " pod="openshift-marketplace/community-operators-8ssqb" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.626201 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-catalog-content\") pod \"community-operators-8ssqb\" (UID: \"65cb61db-ef1b-4173-9fa6-a79ebdc64d43\") " pod="openshift-marketplace/community-operators-8ssqb" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.626299 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vswbh\" (UniqueName: \"kubernetes.io/projected/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-kube-api-access-vswbh\") pod \"community-operators-8ssqb\" (UID: \"65cb61db-ef1b-4173-9fa6-a79ebdc64d43\") " pod="openshift-marketplace/community-operators-8ssqb" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.626928 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-catalog-content\") pod \"community-operators-8ssqb\" (UID: \"65cb61db-ef1b-4173-9fa6-a79ebdc64d43\") " pod="openshift-marketplace/community-operators-8ssqb" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.626949 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-utilities\") pod \"community-operators-8ssqb\" (UID: \"65cb61db-ef1b-4173-9fa6-a79ebdc64d43\") " pod="openshift-marketplace/community-operators-8ssqb" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.659974 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vswbh\" (UniqueName: \"kubernetes.io/projected/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-kube-api-access-vswbh\") pod \"community-operators-8ssqb\" (UID: \"65cb61db-ef1b-4173-9fa6-a79ebdc64d43\") " pod="openshift-marketplace/community-operators-8ssqb" Feb 21 22:13:14 crc kubenswrapper[4717]: I0221 22:13:14.726828 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8ssqb" Feb 21 22:13:15 crc kubenswrapper[4717]: I0221 22:13:15.272727 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ssqb"] Feb 21 22:13:15 crc kubenswrapper[4717]: I0221 22:13:15.760564 4717 generic.go:334] "Generic (PLEG): container finished" podID="65cb61db-ef1b-4173-9fa6-a79ebdc64d43" containerID="4bcdf1d2ea3292f9a33f1af705eeceebe1a79ac094215a9b7bbc4b0c2704a8c9" exitCode=0 Feb 21 22:13:15 crc kubenswrapper[4717]: I0221 22:13:15.760791 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ssqb" event={"ID":"65cb61db-ef1b-4173-9fa6-a79ebdc64d43","Type":"ContainerDied","Data":"4bcdf1d2ea3292f9a33f1af705eeceebe1a79ac094215a9b7bbc4b0c2704a8c9"} Feb 21 22:13:15 crc kubenswrapper[4717]: I0221 22:13:15.760894 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ssqb" event={"ID":"65cb61db-ef1b-4173-9fa6-a79ebdc64d43","Type":"ContainerStarted","Data":"6e1f871eea217919c95d76ab9d6040d42d9af3c643390def81624a89d45ac30e"} Feb 21 22:13:15 crc kubenswrapper[4717]: I0221 22:13:15.983849 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:13:15 crc kubenswrapper[4717]: E0221 22:13:15.984222 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:13:17 crc kubenswrapper[4717]: I0221 22:13:17.791134 4717 generic.go:334] "Generic (PLEG): container finished" podID="65cb61db-ef1b-4173-9fa6-a79ebdc64d43" 
containerID="58ca20ae537fc8eaaac50a5e69e46d0b8764debec6d6b0f5fc2d7c16a64ac673" exitCode=0 Feb 21 22:13:17 crc kubenswrapper[4717]: I0221 22:13:17.791217 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ssqb" event={"ID":"65cb61db-ef1b-4173-9fa6-a79ebdc64d43","Type":"ContainerDied","Data":"58ca20ae537fc8eaaac50a5e69e46d0b8764debec6d6b0f5fc2d7c16a64ac673"} Feb 21 22:13:18 crc kubenswrapper[4717]: I0221 22:13:18.813013 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ssqb" event={"ID":"65cb61db-ef1b-4173-9fa6-a79ebdc64d43","Type":"ContainerStarted","Data":"1c050475a3d7978694161a04f96a6508dc833f0f85ee4c8c466c8ca707b6a882"} Feb 21 22:13:18 crc kubenswrapper[4717]: I0221 22:13:18.843384 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8ssqb" podStartSLOduration=2.363210497 podStartE2EDuration="4.843365105s" podCreationTimestamp="2026-02-21 22:13:14 +0000 UTC" firstStartedPulling="2026-02-21 22:13:15.762804675 +0000 UTC m=+1610.544338297" lastFinishedPulling="2026-02-21 22:13:18.242959273 +0000 UTC m=+1613.024492905" observedRunningTime="2026-02-21 22:13:18.840433575 +0000 UTC m=+1613.621967207" watchObservedRunningTime="2026-02-21 22:13:18.843365105 +0000 UTC m=+1613.624898737" Feb 21 22:13:20 crc kubenswrapper[4717]: I0221 22:13:20.047366 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-6xk4j"] Feb 21 22:13:20 crc kubenswrapper[4717]: I0221 22:13:20.065954 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-6xk4j"] Feb 21 22:13:21 crc kubenswrapper[4717]: I0221 22:13:21.038693 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-2h82m"] Feb 21 22:13:21 crc kubenswrapper[4717]: I0221 22:13:21.051235 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-2h82m"] 
Feb 21 22:13:22 crc kubenswrapper[4717]: I0221 22:13:22.001007 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3727ff36-57dd-4c91-ab08-d5c87ee4e357" path="/var/lib/kubelet/pods/3727ff36-57dd-4c91-ab08-d5c87ee4e357/volumes"
Feb 21 22:13:22 crc kubenswrapper[4717]: I0221 22:13:22.002698 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a945001c-fdf1-4bda-8012-3df96d9781ce" path="/var/lib/kubelet/pods/a945001c-fdf1-4bda-8012-3df96d9781ce/volumes"
Feb 21 22:13:24 crc kubenswrapper[4717]: I0221 22:13:24.727317 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8ssqb"
Feb 21 22:13:24 crc kubenswrapper[4717]: I0221 22:13:24.727777 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8ssqb"
Feb 21 22:13:24 crc kubenswrapper[4717]: I0221 22:13:24.797947 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8ssqb"
Feb 21 22:13:24 crc kubenswrapper[4717]: I0221 22:13:24.957748 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8ssqb"
Feb 21 22:13:25 crc kubenswrapper[4717]: I0221 22:13:25.035258 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ssqb"]
Feb 21 22:13:26 crc kubenswrapper[4717]: I0221 22:13:26.904079 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8ssqb" podUID="65cb61db-ef1b-4173-9fa6-a79ebdc64d43" containerName="registry-server" containerID="cri-o://1c050475a3d7978694161a04f96a6508dc833f0f85ee4c8c466c8ca707b6a882" gracePeriod=2
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.438386 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ssqb"
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.546362 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vswbh\" (UniqueName: \"kubernetes.io/projected/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-kube-api-access-vswbh\") pod \"65cb61db-ef1b-4173-9fa6-a79ebdc64d43\" (UID: \"65cb61db-ef1b-4173-9fa6-a79ebdc64d43\") "
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.546487 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-utilities\") pod \"65cb61db-ef1b-4173-9fa6-a79ebdc64d43\" (UID: \"65cb61db-ef1b-4173-9fa6-a79ebdc64d43\") "
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.546840 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-catalog-content\") pod \"65cb61db-ef1b-4173-9fa6-a79ebdc64d43\" (UID: \"65cb61db-ef1b-4173-9fa6-a79ebdc64d43\") "
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.547369 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-utilities" (OuterVolumeSpecName: "utilities") pod "65cb61db-ef1b-4173-9fa6-a79ebdc64d43" (UID: "65cb61db-ef1b-4173-9fa6-a79ebdc64d43"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.548005 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-utilities\") on node \"crc\" DevicePath \"\""
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.551633 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-kube-api-access-vswbh" (OuterVolumeSpecName: "kube-api-access-vswbh") pod "65cb61db-ef1b-4173-9fa6-a79ebdc64d43" (UID: "65cb61db-ef1b-4173-9fa6-a79ebdc64d43"). InnerVolumeSpecName "kube-api-access-vswbh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.650725 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vswbh\" (UniqueName: \"kubernetes.io/projected/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-kube-api-access-vswbh\") on node \"crc\" DevicePath \"\""
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.744414 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65cb61db-ef1b-4173-9fa6-a79ebdc64d43" (UID: "65cb61db-ef1b-4173-9fa6-a79ebdc64d43"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.752153 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65cb61db-ef1b-4173-9fa6-a79ebdc64d43-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.917274 4717 generic.go:334] "Generic (PLEG): container finished" podID="65cb61db-ef1b-4173-9fa6-a79ebdc64d43" containerID="1c050475a3d7978694161a04f96a6508dc833f0f85ee4c8c466c8ca707b6a882" exitCode=0
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.917315 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ssqb" event={"ID":"65cb61db-ef1b-4173-9fa6-a79ebdc64d43","Type":"ContainerDied","Data":"1c050475a3d7978694161a04f96a6508dc833f0f85ee4c8c466c8ca707b6a882"}
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.917368 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ssqb" event={"ID":"65cb61db-ef1b-4173-9fa6-a79ebdc64d43","Type":"ContainerDied","Data":"6e1f871eea217919c95d76ab9d6040d42d9af3c643390def81624a89d45ac30e"}
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.917384 4717 scope.go:117] "RemoveContainer" containerID="1c050475a3d7978694161a04f96a6508dc833f0f85ee4c8c466c8ca707b6a882"
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.918691 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ssqb"
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.960998 4717 scope.go:117] "RemoveContainer" containerID="58ca20ae537fc8eaaac50a5e69e46d0b8764debec6d6b0f5fc2d7c16a64ac673"
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.972079 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ssqb"]
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.992991 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8ssqb"]
Feb 21 22:13:27 crc kubenswrapper[4717]: I0221 22:13:27.993895 4717 scope.go:117] "RemoveContainer" containerID="4bcdf1d2ea3292f9a33f1af705eeceebe1a79ac094215a9b7bbc4b0c2704a8c9"
Feb 21 22:13:28 crc kubenswrapper[4717]: I0221 22:13:28.042855 4717 scope.go:117] "RemoveContainer" containerID="1c050475a3d7978694161a04f96a6508dc833f0f85ee4c8c466c8ca707b6a882"
Feb 21 22:13:28 crc kubenswrapper[4717]: E0221 22:13:28.043379 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c050475a3d7978694161a04f96a6508dc833f0f85ee4c8c466c8ca707b6a882\": container with ID starting with 1c050475a3d7978694161a04f96a6508dc833f0f85ee4c8c466c8ca707b6a882 not found: ID does not exist" containerID="1c050475a3d7978694161a04f96a6508dc833f0f85ee4c8c466c8ca707b6a882"
Feb 21 22:13:28 crc kubenswrapper[4717]: I0221 22:13:28.043429 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c050475a3d7978694161a04f96a6508dc833f0f85ee4c8c466c8ca707b6a882"} err="failed to get container status \"1c050475a3d7978694161a04f96a6508dc833f0f85ee4c8c466c8ca707b6a882\": rpc error: code = NotFound desc = could not find container \"1c050475a3d7978694161a04f96a6508dc833f0f85ee4c8c466c8ca707b6a882\": container with ID starting with 1c050475a3d7978694161a04f96a6508dc833f0f85ee4c8c466c8ca707b6a882 not found: ID does not exist"
Feb 21 22:13:28 crc kubenswrapper[4717]: I0221 22:13:28.043486 4717 scope.go:117] "RemoveContainer" containerID="58ca20ae537fc8eaaac50a5e69e46d0b8764debec6d6b0f5fc2d7c16a64ac673"
Feb 21 22:13:28 crc kubenswrapper[4717]: E0221 22:13:28.044243 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58ca20ae537fc8eaaac50a5e69e46d0b8764debec6d6b0f5fc2d7c16a64ac673\": container with ID starting with 58ca20ae537fc8eaaac50a5e69e46d0b8764debec6d6b0f5fc2d7c16a64ac673 not found: ID does not exist" containerID="58ca20ae537fc8eaaac50a5e69e46d0b8764debec6d6b0f5fc2d7c16a64ac673"
Feb 21 22:13:28 crc kubenswrapper[4717]: I0221 22:13:28.044295 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ca20ae537fc8eaaac50a5e69e46d0b8764debec6d6b0f5fc2d7c16a64ac673"} err="failed to get container status \"58ca20ae537fc8eaaac50a5e69e46d0b8764debec6d6b0f5fc2d7c16a64ac673\": rpc error: code = NotFound desc = could not find container \"58ca20ae537fc8eaaac50a5e69e46d0b8764debec6d6b0f5fc2d7c16a64ac673\": container with ID starting with 58ca20ae537fc8eaaac50a5e69e46d0b8764debec6d6b0f5fc2d7c16a64ac673 not found: ID does not exist"
Feb 21 22:13:28 crc kubenswrapper[4717]: I0221 22:13:28.044315 4717 scope.go:117] "RemoveContainer" containerID="4bcdf1d2ea3292f9a33f1af705eeceebe1a79ac094215a9b7bbc4b0c2704a8c9"
Feb 21 22:13:28 crc kubenswrapper[4717]: E0221 22:13:28.044662 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bcdf1d2ea3292f9a33f1af705eeceebe1a79ac094215a9b7bbc4b0c2704a8c9\": container with ID starting with 4bcdf1d2ea3292f9a33f1af705eeceebe1a79ac094215a9b7bbc4b0c2704a8c9 not found: ID does not exist" containerID="4bcdf1d2ea3292f9a33f1af705eeceebe1a79ac094215a9b7bbc4b0c2704a8c9"
Feb 21 22:13:28 crc kubenswrapper[4717]: I0221 22:13:28.044725 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bcdf1d2ea3292f9a33f1af705eeceebe1a79ac094215a9b7bbc4b0c2704a8c9"} err="failed to get container status \"4bcdf1d2ea3292f9a33f1af705eeceebe1a79ac094215a9b7bbc4b0c2704a8c9\": rpc error: code = NotFound desc = could not find container \"4bcdf1d2ea3292f9a33f1af705eeceebe1a79ac094215a9b7bbc4b0c2704a8c9\": container with ID starting with 4bcdf1d2ea3292f9a33f1af705eeceebe1a79ac094215a9b7bbc4b0c2704a8c9 not found: ID does not exist"
Feb 21 22:13:29 crc kubenswrapper[4717]: I0221 22:13:29.015363 4717 scope.go:117] "RemoveContainer" containerID="7e68d9608689e85f550352bf2e011ef2171c7d13c1e06959c6078d29af852392"
Feb 21 22:13:29 crc kubenswrapper[4717]: I0221 22:13:29.055849 4717 scope.go:117] "RemoveContainer" containerID="c8c590bffc5fc340f5d8a8a7611c6d7b82c628ffa8a5d49cfd7f3d19a1152bb0"
Feb 21 22:13:29 crc kubenswrapper[4717]: I0221 22:13:29.142615 4717 scope.go:117] "RemoveContainer" containerID="73880c7cdb898cf7e7d9bcced9ef2f2fa003fe97c191d6135c470578c7ae90f6"
Feb 21 22:13:29 crc kubenswrapper[4717]: I0221 22:13:29.175719 4717 scope.go:117] "RemoveContainer" containerID="138d13f715b4f6a86656eb38c67929a0c8ee1d111fc73febb47ebc73e54203fa"
Feb 21 22:13:29 crc kubenswrapper[4717]: I0221 22:13:29.231706 4717 scope.go:117] "RemoveContainer" containerID="1712bb4801ea35250b75861058b9239da153385e4b26f1deed32e14c620ce1cc"
Feb 21 22:13:29 crc kubenswrapper[4717]: I0221 22:13:29.280286 4717 scope.go:117] "RemoveContainer" containerID="5a4944f557f86e62b86f768738d5868dbe6540a06594c3111ea7858880ba2707"
Feb 21 22:13:29 crc kubenswrapper[4717]: I0221 22:13:29.319228 4717 scope.go:117] "RemoveContainer" containerID="ef3cb12c3f3bed326a7e03e36648cd3104dcb58e7c559d206a74cd53af5fbee2"
Feb 21 22:13:29 crc kubenswrapper[4717]: I0221 22:13:29.977683 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d"
Feb 21 22:13:29 crc kubenswrapper[4717]: E0221 22:13:29.978277 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:13:29 crc kubenswrapper[4717]: I0221 22:13:29.997186 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65cb61db-ef1b-4173-9fa6-a79ebdc64d43" path="/var/lib/kubelet/pods/65cb61db-ef1b-4173-9fa6-a79ebdc64d43/volumes"
Feb 21 22:13:31 crc kubenswrapper[4717]: I0221 22:13:31.970032 4717 generic.go:334] "Generic (PLEG): container finished" podID="79339bc9-6d8a-4fe5-ba8d-37643afe6d98" containerID="a2a3067e1813f770d9054c4a587b2462798f6bd99f7bfeb58b708ca0ac23e7b3" exitCode=0
Feb 21 22:13:31 crc kubenswrapper[4717]: I0221 22:13:31.970089 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" event={"ID":"79339bc9-6d8a-4fe5-ba8d-37643afe6d98","Type":"ContainerDied","Data":"a2a3067e1813f770d9054c4a587b2462798f6bd99f7bfeb58b708ca0ac23e7b3"}
Feb 21 22:13:33 crc kubenswrapper[4717]: I0221 22:13:33.497674 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk"
Feb 21 22:13:33 crc kubenswrapper[4717]: I0221 22:13:33.675717 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2znn\" (UniqueName: \"kubernetes.io/projected/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-kube-api-access-q2znn\") pod \"79339bc9-6d8a-4fe5-ba8d-37643afe6d98\" (UID: \"79339bc9-6d8a-4fe5-ba8d-37643afe6d98\") "
Feb 21 22:13:33 crc kubenswrapper[4717]: I0221 22:13:33.676078 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-ssh-key-openstack-edpm-ipam\") pod \"79339bc9-6d8a-4fe5-ba8d-37643afe6d98\" (UID: \"79339bc9-6d8a-4fe5-ba8d-37643afe6d98\") "
Feb 21 22:13:33 crc kubenswrapper[4717]: I0221 22:13:33.676199 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-inventory\") pod \"79339bc9-6d8a-4fe5-ba8d-37643afe6d98\" (UID: \"79339bc9-6d8a-4fe5-ba8d-37643afe6d98\") "
Feb 21 22:13:33 crc kubenswrapper[4717]: I0221 22:13:33.682436 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-kube-api-access-q2znn" (OuterVolumeSpecName: "kube-api-access-q2znn") pod "79339bc9-6d8a-4fe5-ba8d-37643afe6d98" (UID: "79339bc9-6d8a-4fe5-ba8d-37643afe6d98"). InnerVolumeSpecName "kube-api-access-q2znn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:13:33 crc kubenswrapper[4717]: I0221 22:13:33.717695 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "79339bc9-6d8a-4fe5-ba8d-37643afe6d98" (UID: "79339bc9-6d8a-4fe5-ba8d-37643afe6d98"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:13:33 crc kubenswrapper[4717]: I0221 22:13:33.721180 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-inventory" (OuterVolumeSpecName: "inventory") pod "79339bc9-6d8a-4fe5-ba8d-37643afe6d98" (UID: "79339bc9-6d8a-4fe5-ba8d-37643afe6d98"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:13:33 crc kubenswrapper[4717]: I0221 22:13:33.779671 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 21 22:13:33 crc kubenswrapper[4717]: I0221 22:13:33.779723 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-inventory\") on node \"crc\" DevicePath \"\""
Feb 21 22:13:33 crc kubenswrapper[4717]: I0221 22:13:33.779741 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2znn\" (UniqueName: \"kubernetes.io/projected/79339bc9-6d8a-4fe5-ba8d-37643afe6d98-kube-api-access-q2znn\") on node \"crc\" DevicePath \"\""
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.003446 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk" event={"ID":"79339bc9-6d8a-4fe5-ba8d-37643afe6d98","Type":"ContainerDied","Data":"deeb190849fa3d73040e1be4616c1cfb18b1e0a1aa4e0446b4d6bc9011a7342e"}
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.003530 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deeb190849fa3d73040e1be4616c1cfb18b1e0a1aa4e0446b4d6bc9011a7342e"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.003536 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.128846 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"]
Feb 21 22:13:34 crc kubenswrapper[4717]: E0221 22:13:34.129300 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79339bc9-6d8a-4fe5-ba8d-37643afe6d98" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.129323 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="79339bc9-6d8a-4fe5-ba8d-37643afe6d98" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 21 22:13:34 crc kubenswrapper[4717]: E0221 22:13:34.129353 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65cb61db-ef1b-4173-9fa6-a79ebdc64d43" containerName="registry-server"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.129360 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="65cb61db-ef1b-4173-9fa6-a79ebdc64d43" containerName="registry-server"
Feb 21 22:13:34 crc kubenswrapper[4717]: E0221 22:13:34.129370 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65cb61db-ef1b-4173-9fa6-a79ebdc64d43" containerName="extract-content"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.129376 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="65cb61db-ef1b-4173-9fa6-a79ebdc64d43" containerName="extract-content"
Feb 21 22:13:34 crc kubenswrapper[4717]: E0221 22:13:34.129395 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65cb61db-ef1b-4173-9fa6-a79ebdc64d43" containerName="extract-utilities"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.129401 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="65cb61db-ef1b-4173-9fa6-a79ebdc64d43" containerName="extract-utilities"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.129587 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="79339bc9-6d8a-4fe5-ba8d-37643afe6d98" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.129602 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="65cb61db-ef1b-4173-9fa6-a79ebdc64d43" containerName="registry-server"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.130188 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.135194 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.135972 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.136212 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.136698 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.154124 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"]
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.191286 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx\" (UID: \"2eefd54d-a632-4fd6-a45c-e12e1f810d4a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.191368 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t8ff\" (UniqueName: \"kubernetes.io/projected/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-kube-api-access-5t8ff\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx\" (UID: \"2eefd54d-a632-4fd6-a45c-e12e1f810d4a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.191395 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx\" (UID: \"2eefd54d-a632-4fd6-a45c-e12e1f810d4a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.293239 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx\" (UID: \"2eefd54d-a632-4fd6-a45c-e12e1f810d4a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.293410 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t8ff\" (UniqueName: \"kubernetes.io/projected/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-kube-api-access-5t8ff\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx\" (UID: \"2eefd54d-a632-4fd6-a45c-e12e1f810d4a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.293458 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx\" (UID: \"2eefd54d-a632-4fd6-a45c-e12e1f810d4a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.298957 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx\" (UID: \"2eefd54d-a632-4fd6-a45c-e12e1f810d4a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.310736 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx\" (UID: \"2eefd54d-a632-4fd6-a45c-e12e1f810d4a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.316561 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t8ff\" (UniqueName: \"kubernetes.io/projected/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-kube-api-access-5t8ff\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx\" (UID: \"2eefd54d-a632-4fd6-a45c-e12e1f810d4a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"
Feb 21 22:13:34 crc kubenswrapper[4717]: I0221 22:13:34.492948 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"
Feb 21 22:13:35 crc kubenswrapper[4717]: I0221 22:13:35.045392 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"]
Feb 21 22:13:36 crc kubenswrapper[4717]: I0221 22:13:36.028208 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx" event={"ID":"2eefd54d-a632-4fd6-a45c-e12e1f810d4a","Type":"ContainerStarted","Data":"9d69bb58893af2a66ad7d6befd3b38114b5cb1e44588a9a1be57ebfbd1a298c7"}
Feb 21 22:13:36 crc kubenswrapper[4717]: I0221 22:13:36.028652 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx" event={"ID":"2eefd54d-a632-4fd6-a45c-e12e1f810d4a","Type":"ContainerStarted","Data":"75243df9b77617a59fda90c3a302521d7d229a2161e675073d3e4c2edb0b9fc2"}
Feb 21 22:13:36 crc kubenswrapper[4717]: I0221 22:13:36.053777 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx" podStartSLOduration=1.54891602 podStartE2EDuration="2.053752908s" podCreationTimestamp="2026-02-21 22:13:34 +0000 UTC" firstStartedPulling="2026-02-21 22:13:35.04517612 +0000 UTC m=+1629.826709772" lastFinishedPulling="2026-02-21 22:13:35.550013038 +0000 UTC m=+1630.331546660" observedRunningTime="2026-02-21 22:13:36.042710685 +0000 UTC m=+1630.824244317" watchObservedRunningTime="2026-02-21 22:13:36.053752908 +0000 UTC m=+1630.835286540"
Feb 21 22:13:41 crc kubenswrapper[4717]: I0221 22:13:41.088537 4717 generic.go:334] "Generic (PLEG): container finished" podID="2eefd54d-a632-4fd6-a45c-e12e1f810d4a" containerID="9d69bb58893af2a66ad7d6befd3b38114b5cb1e44588a9a1be57ebfbd1a298c7" exitCode=0
Feb 21 22:13:41 crc kubenswrapper[4717]: I0221 22:13:41.088633 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx" event={"ID":"2eefd54d-a632-4fd6-a45c-e12e1f810d4a","Type":"ContainerDied","Data":"9d69bb58893af2a66ad7d6befd3b38114b5cb1e44588a9a1be57ebfbd1a298c7"}
Feb 21 22:13:42 crc kubenswrapper[4717]: I0221 22:13:42.555907 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"
Feb 21 22:13:42 crc kubenswrapper[4717]: I0221 22:13:42.677912 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t8ff\" (UniqueName: \"kubernetes.io/projected/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-kube-api-access-5t8ff\") pod \"2eefd54d-a632-4fd6-a45c-e12e1f810d4a\" (UID: \"2eefd54d-a632-4fd6-a45c-e12e1f810d4a\") "
Feb 21 22:13:42 crc kubenswrapper[4717]: I0221 22:13:42.678067 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-ssh-key-openstack-edpm-ipam\") pod \"2eefd54d-a632-4fd6-a45c-e12e1f810d4a\" (UID: \"2eefd54d-a632-4fd6-a45c-e12e1f810d4a\") "
Feb 21 22:13:42 crc kubenswrapper[4717]: I0221 22:13:42.678143 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-inventory\") pod \"2eefd54d-a632-4fd6-a45c-e12e1f810d4a\" (UID: \"2eefd54d-a632-4fd6-a45c-e12e1f810d4a\") "
Feb 21 22:13:42 crc kubenswrapper[4717]: I0221 22:13:42.687953 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-kube-api-access-5t8ff" (OuterVolumeSpecName: "kube-api-access-5t8ff") pod "2eefd54d-a632-4fd6-a45c-e12e1f810d4a" (UID: "2eefd54d-a632-4fd6-a45c-e12e1f810d4a"). InnerVolumeSpecName "kube-api-access-5t8ff". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:13:42 crc kubenswrapper[4717]: I0221 22:13:42.713063 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2eefd54d-a632-4fd6-a45c-e12e1f810d4a" (UID: "2eefd54d-a632-4fd6-a45c-e12e1f810d4a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:13:42 crc kubenswrapper[4717]: I0221 22:13:42.717494 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-inventory" (OuterVolumeSpecName: "inventory") pod "2eefd54d-a632-4fd6-a45c-e12e1f810d4a" (UID: "2eefd54d-a632-4fd6-a45c-e12e1f810d4a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:13:42 crc kubenswrapper[4717]: I0221 22:13:42.781583 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-inventory\") on node \"crc\" DevicePath \"\""
Feb 21 22:13:42 crc kubenswrapper[4717]: I0221 22:13:42.781852 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t8ff\" (UniqueName: \"kubernetes.io/projected/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-kube-api-access-5t8ff\") on node \"crc\" DevicePath \"\""
Feb 21 22:13:42 crc kubenswrapper[4717]: I0221 22:13:42.782015 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2eefd54d-a632-4fd6-a45c-e12e1f810d4a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 21 22:13:42 crc kubenswrapper[4717]: I0221 22:13:42.976078 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d"
Feb 21 22:13:42 crc kubenswrapper[4717]: E0221 22:13:42.976846 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.112696 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx" event={"ID":"2eefd54d-a632-4fd6-a45c-e12e1f810d4a","Type":"ContainerDied","Data":"75243df9b77617a59fda90c3a302521d7d229a2161e675073d3e4c2edb0b9fc2"}
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.112754 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75243df9b77617a59fda90c3a302521d7d229a2161e675073d3e4c2edb0b9fc2"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.112823 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.241901 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s"]
Feb 21 22:13:43 crc kubenswrapper[4717]: E0221 22:13:43.242451 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eefd54d-a632-4fd6-a45c-e12e1f810d4a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.242474 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eefd54d-a632-4fd6-a45c-e12e1f810d4a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.242698 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eefd54d-a632-4fd6-a45c-e12e1f810d4a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.243502 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.247232 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.247749 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.248997 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.249048 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.271649 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s"]
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.395824 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ml46s\" (UID: \"0fcb6379-1d8c-44c9-8e50-10c52e40abcd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.395947 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph5hd\" (UniqueName: \"kubernetes.io/projected/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-kube-api-access-ph5hd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ml46s\" (UID: \"0fcb6379-1d8c-44c9-8e50-10c52e40abcd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.396312 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ml46s\" (UID: \"0fcb6379-1d8c-44c9-8e50-10c52e40abcd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.499461 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ml46s\" (UID: \"0fcb6379-1d8c-44c9-8e50-10c52e40abcd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.499552 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ml46s\" (UID: \"0fcb6379-1d8c-44c9-8e50-10c52e40abcd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.499605 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph5hd\" (UniqueName: \"kubernetes.io/projected/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-kube-api-access-ph5hd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ml46s\" (UID: \"0fcb6379-1d8c-44c9-8e50-10c52e40abcd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.505195 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ml46s\" (UID: \"0fcb6379-1d8c-44c9-8e50-10c52e40abcd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.509595 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ml46s\" (UID: \"0fcb6379-1d8c-44c9-8e50-10c52e40abcd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.538357 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph5hd\" (UniqueName: \"kubernetes.io/projected/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-kube-api-access-ph5hd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ml46s\" (UID: \"0fcb6379-1d8c-44c9-8e50-10c52e40abcd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s"
Feb 21 22:13:43 crc kubenswrapper[4717]: I0221 22:13:43.577414 4717 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s" Feb 21 22:13:44 crc kubenswrapper[4717]: I0221 22:13:44.202935 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s"] Feb 21 22:13:45 crc kubenswrapper[4717]: I0221 22:13:45.137801 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s" event={"ID":"0fcb6379-1d8c-44c9-8e50-10c52e40abcd","Type":"ContainerStarted","Data":"2ddc62d2a3aeae834a88bae8419ba386888c7d7944057bcba049fda53f0a273c"} Feb 21 22:13:45 crc kubenswrapper[4717]: I0221 22:13:45.138309 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s" event={"ID":"0fcb6379-1d8c-44c9-8e50-10c52e40abcd","Type":"ContainerStarted","Data":"ecdd43ab0a0fbe8657d88845c9f48b7e3b34eea898748cc6128efe27e7022600"} Feb 21 22:13:45 crc kubenswrapper[4717]: I0221 22:13:45.170504 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s" podStartSLOduration=1.746911303 podStartE2EDuration="2.170472811s" podCreationTimestamp="2026-02-21 22:13:43 +0000 UTC" firstStartedPulling="2026-02-21 22:13:44.203567948 +0000 UTC m=+1638.985101580" lastFinishedPulling="2026-02-21 22:13:44.627129466 +0000 UTC m=+1639.408663088" observedRunningTime="2026-02-21 22:13:45.157942902 +0000 UTC m=+1639.939476524" watchObservedRunningTime="2026-02-21 22:13:45.170472811 +0000 UTC m=+1639.952006473" Feb 21 22:13:57 crc kubenswrapper[4717]: I0221 22:13:57.977653 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:13:57 crc kubenswrapper[4717]: E0221 22:13:57.978685 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:14:05 crc kubenswrapper[4717]: I0221 22:14:05.074139 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b797-account-create-update-mfrb6"]
Feb 21 22:14:05 crc kubenswrapper[4717]: I0221 22:14:05.087709 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-vvzkx"]
Feb 21 22:14:05 crc kubenswrapper[4717]: I0221 22:14:05.100517 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-dbfd-account-create-update-lhmrn"]
Feb 21 22:14:05 crc kubenswrapper[4717]: I0221 22:14:05.109930 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-gvjrw"]
Feb 21 22:14:05 crc kubenswrapper[4717]: I0221 22:14:05.119423 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wl5ms"]
Feb 21 22:14:05 crc kubenswrapper[4717]: I0221 22:14:05.127383 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b797-account-create-update-mfrb6"]
Feb 21 22:14:05 crc kubenswrapper[4717]: I0221 22:14:05.134688 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-vvzkx"]
Feb 21 22:14:05 crc kubenswrapper[4717]: I0221 22:14:05.141159 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-1f48-account-create-update-wjdg5"]
Feb 21 22:14:05 crc kubenswrapper[4717]: I0221 22:14:05.147207 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-gvjrw"]
Feb 21 22:14:05 crc kubenswrapper[4717]: I0221 22:14:05.153687 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wl5ms"]
Feb 21 22:14:05 crc kubenswrapper[4717]: I0221 22:14:05.160099 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-1f48-account-create-update-wjdg5"]
Feb 21 22:14:05 crc kubenswrapper[4717]: I0221 22:14:05.166200 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-dbfd-account-create-update-lhmrn"]
Feb 21 22:14:06 crc kubenswrapper[4717]: I0221 22:14:06.004194 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dfd7221-c036-41a9-be83-4b0d9d1fac4b" path="/var/lib/kubelet/pods/4dfd7221-c036-41a9-be83-4b0d9d1fac4b/volumes"
Feb 21 22:14:06 crc kubenswrapper[4717]: I0221 22:14:06.005663 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d34ce70-aef8-44bc-872a-be96892f145f" path="/var/lib/kubelet/pods/7d34ce70-aef8-44bc-872a-be96892f145f/volumes"
Feb 21 22:14:06 crc kubenswrapper[4717]: I0221 22:14:06.006483 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ecbe558-f60c-4be1-8dbb-35110ba64185" path="/var/lib/kubelet/pods/8ecbe558-f60c-4be1-8dbb-35110ba64185/volumes"
Feb 21 22:14:06 crc kubenswrapper[4717]: I0221 22:14:06.007981 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9186937b-6e90-45ec-9494-aef66cdfe28b" path="/var/lib/kubelet/pods/9186937b-6e90-45ec-9494-aef66cdfe28b/volumes"
Feb 21 22:14:06 crc kubenswrapper[4717]: I0221 22:14:06.013416 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d430f5c-b58e-4b87-9ff4-391c6d796215" path="/var/lib/kubelet/pods/9d430f5c-b58e-4b87-9ff4-391c6d796215/volumes"
Feb 21 22:14:06 crc kubenswrapper[4717]: I0221 22:14:06.014442 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2afd36f-b3bb-4232-8b66-2cc64cf53f6a" path="/var/lib/kubelet/pods/d2afd36f-b3bb-4232-8b66-2cc64cf53f6a/volumes"
Feb 21 22:14:09 crc kubenswrapper[4717]: I0221 22:14:09.976280 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d"
Feb 21 22:14:09 crc kubenswrapper[4717]: E0221 22:14:09.976942 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:14:22 crc kubenswrapper[4717]: I0221 22:14:22.902223 4717 generic.go:334] "Generic (PLEG): container finished" podID="0fcb6379-1d8c-44c9-8e50-10c52e40abcd" containerID="2ddc62d2a3aeae834a88bae8419ba386888c7d7944057bcba049fda53f0a273c" exitCode=0
Feb 21 22:14:22 crc kubenswrapper[4717]: I0221 22:14:22.902404 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s" event={"ID":"0fcb6379-1d8c-44c9-8e50-10c52e40abcd","Type":"ContainerDied","Data":"2ddc62d2a3aeae834a88bae8419ba386888c7d7944057bcba049fda53f0a273c"}
Feb 21 22:14:24 crc kubenswrapper[4717]: I0221 22:14:24.397104 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s"
Feb 21 22:14:24 crc kubenswrapper[4717]: I0221 22:14:24.608278 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-inventory\") pod \"0fcb6379-1d8c-44c9-8e50-10c52e40abcd\" (UID: \"0fcb6379-1d8c-44c9-8e50-10c52e40abcd\") "
Feb 21 22:14:24 crc kubenswrapper[4717]: I0221 22:14:24.608385 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph5hd\" (UniqueName: \"kubernetes.io/projected/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-kube-api-access-ph5hd\") pod \"0fcb6379-1d8c-44c9-8e50-10c52e40abcd\" (UID: \"0fcb6379-1d8c-44c9-8e50-10c52e40abcd\") "
Feb 21 22:14:24 crc kubenswrapper[4717]: I0221 22:14:24.608532 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-ssh-key-openstack-edpm-ipam\") pod \"0fcb6379-1d8c-44c9-8e50-10c52e40abcd\" (UID: \"0fcb6379-1d8c-44c9-8e50-10c52e40abcd\") "
Feb 21 22:14:24 crc kubenswrapper[4717]: I0221 22:14:24.614252 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-kube-api-access-ph5hd" (OuterVolumeSpecName: "kube-api-access-ph5hd") pod "0fcb6379-1d8c-44c9-8e50-10c52e40abcd" (UID: "0fcb6379-1d8c-44c9-8e50-10c52e40abcd"). InnerVolumeSpecName "kube-api-access-ph5hd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:14:24 crc kubenswrapper[4717]: I0221 22:14:24.639750 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0fcb6379-1d8c-44c9-8e50-10c52e40abcd" (UID: "0fcb6379-1d8c-44c9-8e50-10c52e40abcd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:14:24 crc kubenswrapper[4717]: I0221 22:14:24.643055 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-inventory" (OuterVolumeSpecName: "inventory") pod "0fcb6379-1d8c-44c9-8e50-10c52e40abcd" (UID: "0fcb6379-1d8c-44c9-8e50-10c52e40abcd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:14:24 crc kubenswrapper[4717]: I0221 22:14:24.710681 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-inventory\") on node \"crc\" DevicePath \"\""
Feb 21 22:14:24 crc kubenswrapper[4717]: I0221 22:14:24.710734 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph5hd\" (UniqueName: \"kubernetes.io/projected/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-kube-api-access-ph5hd\") on node \"crc\" DevicePath \"\""
Feb 21 22:14:24 crc kubenswrapper[4717]: I0221 22:14:24.710794 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fcb6379-1d8c-44c9-8e50-10c52e40abcd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 21 22:14:24 crc kubenswrapper[4717]: I0221 22:14:24.926723 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s" 
event={"ID":"0fcb6379-1d8c-44c9-8e50-10c52e40abcd","Type":"ContainerDied","Data":"ecdd43ab0a0fbe8657d88845c9f48b7e3b34eea898748cc6128efe27e7022600"}
Feb 21 22:14:24 crc kubenswrapper[4717]: I0221 22:14:24.926791 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecdd43ab0a0fbe8657d88845c9f48b7e3b34eea898748cc6128efe27e7022600"
Feb 21 22:14:24 crc kubenswrapper[4717]: I0221 22:14:24.926829 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ml46s"
Feb 21 22:14:24 crc kubenswrapper[4717]: I0221 22:14:24.977789 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d"
Feb 21 22:14:24 crc kubenswrapper[4717]: E0221 22:14:24.978329 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.038101 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66"]
Feb 21 22:14:25 crc kubenswrapper[4717]: E0221 22:14:25.038632 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fcb6379-1d8c-44c9-8e50-10c52e40abcd" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.038654 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fcb6379-1d8c-44c9-8e50-10c52e40abcd" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.038946 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fcb6379-1d8c-44c9-8e50-10c52e40abcd" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.039711 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.042110 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.042590 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.042847 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.048115 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.048917 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66"]
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.220118 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce64ba17-432c-46d7-86f2-33cc62514604-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6hp66\" (UID: \"ce64ba17-432c-46d7-86f2-33cc62514604\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.220559 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66c9q\" (UniqueName: \"kubernetes.io/projected/ce64ba17-432c-46d7-86f2-33cc62514604-kube-api-access-66c9q\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6hp66\" (UID: \"ce64ba17-432c-46d7-86f2-33cc62514604\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.220615 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce64ba17-432c-46d7-86f2-33cc62514604-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6hp66\" (UID: \"ce64ba17-432c-46d7-86f2-33cc62514604\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.322688 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce64ba17-432c-46d7-86f2-33cc62514604-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6hp66\" (UID: \"ce64ba17-432c-46d7-86f2-33cc62514604\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.322848 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66c9q\" (UniqueName: \"kubernetes.io/projected/ce64ba17-432c-46d7-86f2-33cc62514604-kube-api-access-66c9q\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6hp66\" (UID: \"ce64ba17-432c-46d7-86f2-33cc62514604\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.322928 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce64ba17-432c-46d7-86f2-33cc62514604-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6hp66\" (UID: \"ce64ba17-432c-46d7-86f2-33cc62514604\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.327767 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce64ba17-432c-46d7-86f2-33cc62514604-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6hp66\" (UID: \"ce64ba17-432c-46d7-86f2-33cc62514604\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.331668 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce64ba17-432c-46d7-86f2-33cc62514604-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6hp66\" (UID: \"ce64ba17-432c-46d7-86f2-33cc62514604\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.344533 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66c9q\" (UniqueName: \"kubernetes.io/projected/ce64ba17-432c-46d7-86f2-33cc62514604-kube-api-access-66c9q\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6hp66\" (UID: \"ce64ba17-432c-46d7-86f2-33cc62514604\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.361139 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66"
Feb 21 22:14:25 crc kubenswrapper[4717]: I0221 22:14:25.944182 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66"]
Feb 21 22:14:25 crc kubenswrapper[4717]: W0221 22:14:25.945439 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce64ba17_432c_46d7_86f2_33cc62514604.slice/crio-1ef625ed8e90027690409a756d5722ab0d6e42125ee4eeb677db1e60861f3baa WatchSource:0}: Error finding container 1ef625ed8e90027690409a756d5722ab0d6e42125ee4eeb677db1e60861f3baa: Status 404 returned error can't find the container with id 1ef625ed8e90027690409a756d5722ab0d6e42125ee4eeb677db1e60861f3baa
Feb 21 22:14:26 crc kubenswrapper[4717]: I0221 22:14:26.398056 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 21 22:14:26 crc kubenswrapper[4717]: I0221 22:14:26.958374 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66" event={"ID":"ce64ba17-432c-46d7-86f2-33cc62514604","Type":"ContainerStarted","Data":"da8beff2400fff2859bcb61a6bb870a39bfbe663ac79c98e33543424c8b4a1d4"}
Feb 21 22:14:26 crc kubenswrapper[4717]: I0221 22:14:26.958656 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66" event={"ID":"ce64ba17-432c-46d7-86f2-33cc62514604","Type":"ContainerStarted","Data":"1ef625ed8e90027690409a756d5722ab0d6e42125ee4eeb677db1e60861f3baa"}
Feb 21 22:14:26 crc kubenswrapper[4717]: I0221 22:14:26.979235 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66" podStartSLOduration=1.533707296 podStartE2EDuration="1.979218037s" podCreationTimestamp="2026-02-21 22:14:25 +0000 UTC" firstStartedPulling="2026-02-21 22:14:25.947908208 +0000 UTC m=+1680.729441870" lastFinishedPulling="2026-02-21 22:14:26.393418979 +0000 UTC m=+1681.174952611" observedRunningTime="2026-02-21 22:14:26.976962664 +0000 UTC m=+1681.758496306" watchObservedRunningTime="2026-02-21 22:14:26.979218037 +0000 UTC m=+1681.760751669"
Feb 21 22:14:29 crc kubenswrapper[4717]: I0221 22:14:29.523894 4717 scope.go:117] "RemoveContainer" containerID="fcd44a1187b50a99d3c545854bd8b3f477554d831ffa40d1532d971cb1dc03c8"
Feb 21 22:14:29 crc kubenswrapper[4717]: I0221 22:14:29.565670 4717 scope.go:117] "RemoveContainer" containerID="d572024a8c5af8297462282be44724c6982034ea6de6810b1a38b9cb6a3d2b96"
Feb 21 22:14:29 crc kubenswrapper[4717]: I0221 22:14:29.635942 4717 scope.go:117] "RemoveContainer" containerID="230f2a9545a84d795c8ddd65b06ca9b914295efb8d182ab283e3028179e3398d"
Feb 21 22:14:29 crc kubenswrapper[4717]: I0221 22:14:29.685848 4717 scope.go:117] "RemoveContainer" containerID="1c20a06b0e7e5a9f33013e53a59c5d71c23b9e60b25e73ebee0f4d48c2ad3362"
Feb 21 22:14:29 crc kubenswrapper[4717]: I0221 22:14:29.725276 4717 scope.go:117] "RemoveContainer" containerID="4e6667a53120acec0f2b5bc99ee5bea5984df76c459961e4c47d0fd7a9383d4c"
Feb 21 22:14:29 crc kubenswrapper[4717]: I0221 22:14:29.777680 4717 scope.go:117] "RemoveContainer" containerID="4340e361658741eaff1dbdeeba057fca6c5da84ba60cbd0228467f7e3eab75f4"
Feb 21 22:14:33 crc kubenswrapper[4717]: I0221 22:14:33.075772 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2fx6v"]
Feb 21 22:14:33 crc kubenswrapper[4717]: I0221 22:14:33.089777 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2fx6v"]
Feb 21 22:14:33 crc kubenswrapper[4717]: I0221 22:14:33.996129 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c98495-0073-4b91-a0bc-84a2e6f04a01" path="/var/lib/kubelet/pods/35c98495-0073-4b91-a0bc-84a2e6f04a01/volumes"
Feb 21 22:14:39 crc kubenswrapper[4717]: I0221 22:14:39.982047 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d"
Feb 21 22:14:39 crc kubenswrapper[4717]: E0221 22:14:39.982965 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:14:51 crc kubenswrapper[4717]: I0221 22:14:51.067287 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-6nknn"]
Feb 21 22:14:51 crc kubenswrapper[4717]: I0221 22:14:51.081886 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-6nknn"]
Feb 21 22:14:51 crc kubenswrapper[4717]: I0221 22:14:51.997270 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a97228-6638-44d9-a311-70460be3479e" path="/var/lib/kubelet/pods/15a97228-6638-44d9-a311-70460be3479e/volumes"
Feb 21 22:14:52 crc kubenswrapper[4717]: I0221 22:14:52.037738 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m5gsm"]
Feb 21 22:14:52 crc kubenswrapper[4717]: I0221 22:14:52.048698 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m5gsm"]
Feb 21 22:14:53 crc kubenswrapper[4717]: I0221 22:14:53.977105 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d"
Feb 21 22:14:53 crc kubenswrapper[4717]: E0221 22:14:53.977942 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:14:53 crc kubenswrapper[4717]: I0221 22:14:53.996955 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c514ae-3636-4054-a900-61fb5fe5c598" path="/var/lib/kubelet/pods/40c514ae-3636-4054-a900-61fb5fe5c598/volumes"
Feb 21 22:15:00 crc kubenswrapper[4717]: I0221 22:15:00.163709 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx"]
Feb 21 22:15:00 crc kubenswrapper[4717]: I0221 22:15:00.165544 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx"
Feb 21 22:15:00 crc kubenswrapper[4717]: I0221 22:15:00.168163 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 21 22:15:00 crc kubenswrapper[4717]: I0221 22:15:00.173617 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 21 22:15:00 crc kubenswrapper[4717]: I0221 22:15:00.200346 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx"]
Feb 21 22:15:00 crc kubenswrapper[4717]: I0221 22:15:00.330661 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23e37b1c-c564-490c-8f7e-92dd601a6451-secret-volume\") pod \"collect-profiles-29528535-bp9xx\" (UID: \"23e37b1c-c564-490c-8f7e-92dd601a6451\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx"
Feb 21 22:15:00 crc kubenswrapper[4717]: 
I0221 22:15:00.331031 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23e37b1c-c564-490c-8f7e-92dd601a6451-config-volume\") pod \"collect-profiles-29528535-bp9xx\" (UID: \"23e37b1c-c564-490c-8f7e-92dd601a6451\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx"
Feb 21 22:15:00 crc kubenswrapper[4717]: I0221 22:15:00.331187 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j899\" (UniqueName: \"kubernetes.io/projected/23e37b1c-c564-490c-8f7e-92dd601a6451-kube-api-access-9j899\") pod \"collect-profiles-29528535-bp9xx\" (UID: \"23e37b1c-c564-490c-8f7e-92dd601a6451\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx"
Feb 21 22:15:00 crc kubenswrapper[4717]: I0221 22:15:00.432674 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23e37b1c-c564-490c-8f7e-92dd601a6451-secret-volume\") pod \"collect-profiles-29528535-bp9xx\" (UID: \"23e37b1c-c564-490c-8f7e-92dd601a6451\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx"
Feb 21 22:15:00 crc kubenswrapper[4717]: I0221 22:15:00.432807 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23e37b1c-c564-490c-8f7e-92dd601a6451-config-volume\") pod \"collect-profiles-29528535-bp9xx\" (UID: \"23e37b1c-c564-490c-8f7e-92dd601a6451\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx"
Feb 21 22:15:00 crc kubenswrapper[4717]: I0221 22:15:00.432878 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j899\" (UniqueName: \"kubernetes.io/projected/23e37b1c-c564-490c-8f7e-92dd601a6451-kube-api-access-9j899\") pod \"collect-profiles-29528535-bp9xx\" (UID: \"23e37b1c-c564-490c-8f7e-92dd601a6451\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx"
Feb 21 22:15:00 crc kubenswrapper[4717]: I0221 22:15:00.434362 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23e37b1c-c564-490c-8f7e-92dd601a6451-config-volume\") pod \"collect-profiles-29528535-bp9xx\" (UID: \"23e37b1c-c564-490c-8f7e-92dd601a6451\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx"
Feb 21 22:15:00 crc kubenswrapper[4717]: I0221 22:15:00.444087 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23e37b1c-c564-490c-8f7e-92dd601a6451-secret-volume\") pod \"collect-profiles-29528535-bp9xx\" (UID: \"23e37b1c-c564-490c-8f7e-92dd601a6451\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx"
Feb 21 22:15:00 crc kubenswrapper[4717]: I0221 22:15:00.449128 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j899\" (UniqueName: \"kubernetes.io/projected/23e37b1c-c564-490c-8f7e-92dd601a6451-kube-api-access-9j899\") pod \"collect-profiles-29528535-bp9xx\" (UID: \"23e37b1c-c564-490c-8f7e-92dd601a6451\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx"
Feb 21 22:15:00 crc kubenswrapper[4717]: I0221 22:15:00.505932 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx"
Feb 21 22:15:00 crc kubenswrapper[4717]: I0221 22:15:00.984582 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx"]
Feb 21 22:15:01 crc kubenswrapper[4717]: I0221 22:15:01.314288 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx" event={"ID":"23e37b1c-c564-490c-8f7e-92dd601a6451","Type":"ContainerStarted","Data":"ca6fc13b296092d0fba0a4f00472161cccf1667f74dfe9defb77604349b0d277"}
Feb 21 22:15:01 crc kubenswrapper[4717]: I0221 22:15:01.314630 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx" event={"ID":"23e37b1c-c564-490c-8f7e-92dd601a6451","Type":"ContainerStarted","Data":"1e0b959428f05f6449329590609d082393b1fc359dc816d240625add533c1db0"}
Feb 21 22:15:01 crc kubenswrapper[4717]: E0221 22:15:01.544154 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23e37b1c_c564_490c_8f7e_92dd601a6451.slice/crio-conmon-ca6fc13b296092d0fba0a4f00472161cccf1667f74dfe9defb77604349b0d277.scope\": RecentStats: unable to find data in memory cache]"
Feb 21 22:15:02 crc kubenswrapper[4717]: I0221 22:15:02.327971 4717 generic.go:334] "Generic (PLEG): container finished" podID="23e37b1c-c564-490c-8f7e-92dd601a6451" containerID="ca6fc13b296092d0fba0a4f00472161cccf1667f74dfe9defb77604349b0d277" exitCode=0
Feb 21 22:15:02 crc kubenswrapper[4717]: I0221 22:15:02.328050 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx" event={"ID":"23e37b1c-c564-490c-8f7e-92dd601a6451","Type":"ContainerDied","Data":"ca6fc13b296092d0fba0a4f00472161cccf1667f74dfe9defb77604349b0d277"}
Feb 21 22:15:03 crc kubenswrapper[4717]: I0221 22:15:03.736501 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx"
Feb 21 22:15:03 crc kubenswrapper[4717]: I0221 22:15:03.937590 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23e37b1c-c564-490c-8f7e-92dd601a6451-secret-volume\") pod \"23e37b1c-c564-490c-8f7e-92dd601a6451\" (UID: \"23e37b1c-c564-490c-8f7e-92dd601a6451\") "
Feb 21 22:15:03 crc kubenswrapper[4717]: I0221 22:15:03.937711 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j899\" (UniqueName: \"kubernetes.io/projected/23e37b1c-c564-490c-8f7e-92dd601a6451-kube-api-access-9j899\") pod \"23e37b1c-c564-490c-8f7e-92dd601a6451\" (UID: \"23e37b1c-c564-490c-8f7e-92dd601a6451\") "
Feb 21 22:15:03 crc kubenswrapper[4717]: I0221 22:15:03.937829 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23e37b1c-c564-490c-8f7e-92dd601a6451-config-volume\") pod \"23e37b1c-c564-490c-8f7e-92dd601a6451\" (UID: \"23e37b1c-c564-490c-8f7e-92dd601a6451\") "
Feb 21 22:15:03 crc kubenswrapper[4717]: I0221 22:15:03.938686 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e37b1c-c564-490c-8f7e-92dd601a6451-config-volume" (OuterVolumeSpecName: "config-volume") pod "23e37b1c-c564-490c-8f7e-92dd601a6451" (UID: "23e37b1c-c564-490c-8f7e-92dd601a6451"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:15:03 crc kubenswrapper[4717]: I0221 22:15:03.946285 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e37b1c-c564-490c-8f7e-92dd601a6451-kube-api-access-9j899" (OuterVolumeSpecName: "kube-api-access-9j899") pod "23e37b1c-c564-490c-8f7e-92dd601a6451" (UID: "23e37b1c-c564-490c-8f7e-92dd601a6451"). InnerVolumeSpecName "kube-api-access-9j899". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:15:03 crc kubenswrapper[4717]: I0221 22:15:03.948520 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e37b1c-c564-490c-8f7e-92dd601a6451-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "23e37b1c-c564-490c-8f7e-92dd601a6451" (UID: "23e37b1c-c564-490c-8f7e-92dd601a6451"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:15:04 crc kubenswrapper[4717]: I0221 22:15:04.042105 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j899\" (UniqueName: \"kubernetes.io/projected/23e37b1c-c564-490c-8f7e-92dd601a6451-kube-api-access-9j899\") on node \"crc\" DevicePath \"\"" Feb 21 22:15:04 crc kubenswrapper[4717]: I0221 22:15:04.042167 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23e37b1c-c564-490c-8f7e-92dd601a6451-config-volume\") on node \"crc\" DevicePath \"\"" Feb 21 22:15:04 crc kubenswrapper[4717]: I0221 22:15:04.042188 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23e37b1c-c564-490c-8f7e-92dd601a6451-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 21 22:15:04 crc kubenswrapper[4717]: I0221 22:15:04.351214 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx" 
event={"ID":"23e37b1c-c564-490c-8f7e-92dd601a6451","Type":"ContainerDied","Data":"1e0b959428f05f6449329590609d082393b1fc359dc816d240625add533c1db0"} Feb 21 22:15:04 crc kubenswrapper[4717]: I0221 22:15:04.351520 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e0b959428f05f6449329590609d082393b1fc359dc816d240625add533c1db0" Feb 21 22:15:04 crc kubenswrapper[4717]: I0221 22:15:04.351295 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528535-bp9xx" Feb 21 22:15:08 crc kubenswrapper[4717]: I0221 22:15:08.977056 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:15:08 crc kubenswrapper[4717]: E0221 22:15:08.978290 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:15:18 crc kubenswrapper[4717]: I0221 22:15:18.526717 4717 generic.go:334] "Generic (PLEG): container finished" podID="ce64ba17-432c-46d7-86f2-33cc62514604" containerID="da8beff2400fff2859bcb61a6bb870a39bfbe663ac79c98e33543424c8b4a1d4" exitCode=0 Feb 21 22:15:18 crc kubenswrapper[4717]: I0221 22:15:18.526782 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66" event={"ID":"ce64ba17-432c-46d7-86f2-33cc62514604","Type":"ContainerDied","Data":"da8beff2400fff2859bcb61a6bb870a39bfbe663ac79c98e33543424c8b4a1d4"} Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.007880 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.120256 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce64ba17-432c-46d7-86f2-33cc62514604-ssh-key-openstack-edpm-ipam\") pod \"ce64ba17-432c-46d7-86f2-33cc62514604\" (UID: \"ce64ba17-432c-46d7-86f2-33cc62514604\") " Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.120331 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce64ba17-432c-46d7-86f2-33cc62514604-inventory\") pod \"ce64ba17-432c-46d7-86f2-33cc62514604\" (UID: \"ce64ba17-432c-46d7-86f2-33cc62514604\") " Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.120503 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66c9q\" (UniqueName: \"kubernetes.io/projected/ce64ba17-432c-46d7-86f2-33cc62514604-kube-api-access-66c9q\") pod \"ce64ba17-432c-46d7-86f2-33cc62514604\" (UID: \"ce64ba17-432c-46d7-86f2-33cc62514604\") " Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.128565 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce64ba17-432c-46d7-86f2-33cc62514604-kube-api-access-66c9q" (OuterVolumeSpecName: "kube-api-access-66c9q") pod "ce64ba17-432c-46d7-86f2-33cc62514604" (UID: "ce64ba17-432c-46d7-86f2-33cc62514604"). InnerVolumeSpecName "kube-api-access-66c9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.153164 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce64ba17-432c-46d7-86f2-33cc62514604-inventory" (OuterVolumeSpecName: "inventory") pod "ce64ba17-432c-46d7-86f2-33cc62514604" (UID: "ce64ba17-432c-46d7-86f2-33cc62514604"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.157250 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce64ba17-432c-46d7-86f2-33cc62514604-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ce64ba17-432c-46d7-86f2-33cc62514604" (UID: "ce64ba17-432c-46d7-86f2-33cc62514604"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.222801 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce64ba17-432c-46d7-86f2-33cc62514604-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.222852 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce64ba17-432c-46d7-86f2-33cc62514604-inventory\") on node \"crc\" DevicePath \"\"" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.222896 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66c9q\" (UniqueName: \"kubernetes.io/projected/ce64ba17-432c-46d7-86f2-33cc62514604-kube-api-access-66c9q\") on node \"crc\" DevicePath \"\"" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.546800 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66" event={"ID":"ce64ba17-432c-46d7-86f2-33cc62514604","Type":"ContainerDied","Data":"1ef625ed8e90027690409a756d5722ab0d6e42125ee4eeb677db1e60861f3baa"} Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.547429 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ef625ed8e90027690409a756d5722ab0d6e42125ee4eeb677db1e60861f3baa" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 
22:15:20.546909 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6hp66" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.648217 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gshck"] Feb 21 22:15:20 crc kubenswrapper[4717]: E0221 22:15:20.648662 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e37b1c-c564-490c-8f7e-92dd601a6451" containerName="collect-profiles" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.648688 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e37b1c-c564-490c-8f7e-92dd601a6451" containerName="collect-profiles" Feb 21 22:15:20 crc kubenswrapper[4717]: E0221 22:15:20.648725 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce64ba17-432c-46d7-86f2-33cc62514604" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.648736 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce64ba17-432c-46d7-86f2-33cc62514604" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.648963 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce64ba17-432c-46d7-86f2-33cc62514604" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.648987 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e37b1c-c564-490c-8f7e-92dd601a6451" containerName="collect-profiles" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.649781 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gshck" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.651848 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.652764 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.652977 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.655031 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.659132 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gshck"] Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.832833 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dee1c8dd-766e-41de-8631-cfc7d23a7681-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-gshck\" (UID: \"dee1c8dd-766e-41de-8631-cfc7d23a7681\") " pod="openstack/ssh-known-hosts-edpm-deployment-gshck" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.832940 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dee1c8dd-766e-41de-8631-cfc7d23a7681-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-gshck\" (UID: \"dee1c8dd-766e-41de-8631-cfc7d23a7681\") " pod="openstack/ssh-known-hosts-edpm-deployment-gshck" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.832989 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-w4cdp\" (UniqueName: \"kubernetes.io/projected/dee1c8dd-766e-41de-8631-cfc7d23a7681-kube-api-access-w4cdp\") pod \"ssh-known-hosts-edpm-deployment-gshck\" (UID: \"dee1c8dd-766e-41de-8631-cfc7d23a7681\") " pod="openstack/ssh-known-hosts-edpm-deployment-gshck" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.935214 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dee1c8dd-766e-41de-8631-cfc7d23a7681-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-gshck\" (UID: \"dee1c8dd-766e-41de-8631-cfc7d23a7681\") " pod="openstack/ssh-known-hosts-edpm-deployment-gshck" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.935652 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4cdp\" (UniqueName: \"kubernetes.io/projected/dee1c8dd-766e-41de-8631-cfc7d23a7681-kube-api-access-w4cdp\") pod \"ssh-known-hosts-edpm-deployment-gshck\" (UID: \"dee1c8dd-766e-41de-8631-cfc7d23a7681\") " pod="openstack/ssh-known-hosts-edpm-deployment-gshck" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.935971 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dee1c8dd-766e-41de-8631-cfc7d23a7681-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-gshck\" (UID: \"dee1c8dd-766e-41de-8631-cfc7d23a7681\") " pod="openstack/ssh-known-hosts-edpm-deployment-gshck" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.943594 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dee1c8dd-766e-41de-8631-cfc7d23a7681-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-gshck\" (UID: \"dee1c8dd-766e-41de-8631-cfc7d23a7681\") " pod="openstack/ssh-known-hosts-edpm-deployment-gshck" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.946091 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dee1c8dd-766e-41de-8631-cfc7d23a7681-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-gshck\" (UID: \"dee1c8dd-766e-41de-8631-cfc7d23a7681\") " pod="openstack/ssh-known-hosts-edpm-deployment-gshck" Feb 21 22:15:20 crc kubenswrapper[4717]: I0221 22:15:20.965814 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4cdp\" (UniqueName: \"kubernetes.io/projected/dee1c8dd-766e-41de-8631-cfc7d23a7681-kube-api-access-w4cdp\") pod \"ssh-known-hosts-edpm-deployment-gshck\" (UID: \"dee1c8dd-766e-41de-8631-cfc7d23a7681\") " pod="openstack/ssh-known-hosts-edpm-deployment-gshck" Feb 21 22:15:21 crc kubenswrapper[4717]: I0221 22:15:21.012550 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gshck" Feb 21 22:15:21 crc kubenswrapper[4717]: I0221 22:15:21.342495 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gshck"] Feb 21 22:15:21 crc kubenswrapper[4717]: I0221 22:15:21.557011 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gshck" event={"ID":"dee1c8dd-766e-41de-8631-cfc7d23a7681","Type":"ContainerStarted","Data":"f4a97f804720b1c528466ca4efc9efc3b4d77ad584dbd89c86acf74c7eaaaa93"} Feb 21 22:15:22 crc kubenswrapper[4717]: I0221 22:15:22.568163 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gshck" event={"ID":"dee1c8dd-766e-41de-8631-cfc7d23a7681","Type":"ContainerStarted","Data":"347100f7fe23da4ddd69922079337ca018d849b5e982dfa16a6df52df6fe3937"} Feb 21 22:15:22 crc kubenswrapper[4717]: I0221 22:15:22.601264 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-gshck" 
podStartSLOduration=2.071838098 podStartE2EDuration="2.601242407s" podCreationTimestamp="2026-02-21 22:15:20 +0000 UTC" firstStartedPulling="2026-02-21 22:15:21.35406568 +0000 UTC m=+1736.135599292" lastFinishedPulling="2026-02-21 22:15:21.883469979 +0000 UTC m=+1736.665003601" observedRunningTime="2026-02-21 22:15:22.591355159 +0000 UTC m=+1737.372888831" watchObservedRunningTime="2026-02-21 22:15:22.601242407 +0000 UTC m=+1737.382776039" Feb 21 22:15:23 crc kubenswrapper[4717]: I0221 22:15:23.976546 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:15:23 crc kubenswrapper[4717]: E0221 22:15:23.976807 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:15:29 crc kubenswrapper[4717]: I0221 22:15:29.640809 4717 generic.go:334] "Generic (PLEG): container finished" podID="dee1c8dd-766e-41de-8631-cfc7d23a7681" containerID="347100f7fe23da4ddd69922079337ca018d849b5e982dfa16a6df52df6fe3937" exitCode=0 Feb 21 22:15:29 crc kubenswrapper[4717]: I0221 22:15:29.641623 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gshck" event={"ID":"dee1c8dd-766e-41de-8631-cfc7d23a7681","Type":"ContainerDied","Data":"347100f7fe23da4ddd69922079337ca018d849b5e982dfa16a6df52df6fe3937"} Feb 21 22:15:29 crc kubenswrapper[4717]: I0221 22:15:29.914279 4717 scope.go:117] "RemoveContainer" containerID="ae81b976f47fcde895a06f4144759381230a0bfd42120e50886b7ce1da6d3bf2" Feb 21 22:15:30 crc kubenswrapper[4717]: I0221 22:15:30.001441 4717 scope.go:117] "RemoveContainer" 
containerID="d0c5a4385227f76979b13e9221d9bcfee079a7bdc6efd002cbaf270f65e2f854" Feb 21 22:15:30 crc kubenswrapper[4717]: I0221 22:15:30.083183 4717 scope.go:117] "RemoveContainer" containerID="6ee8e5590e0a055de4065fbd758f2f6ea37d66a5c10a9d121772448ea0557df2" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.097577 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gshck" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.263061 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4cdp\" (UniqueName: \"kubernetes.io/projected/dee1c8dd-766e-41de-8631-cfc7d23a7681-kube-api-access-w4cdp\") pod \"dee1c8dd-766e-41de-8631-cfc7d23a7681\" (UID: \"dee1c8dd-766e-41de-8631-cfc7d23a7681\") " Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.263149 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dee1c8dd-766e-41de-8631-cfc7d23a7681-ssh-key-openstack-edpm-ipam\") pod \"dee1c8dd-766e-41de-8631-cfc7d23a7681\" (UID: \"dee1c8dd-766e-41de-8631-cfc7d23a7681\") " Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.263177 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dee1c8dd-766e-41de-8631-cfc7d23a7681-inventory-0\") pod \"dee1c8dd-766e-41de-8631-cfc7d23a7681\" (UID: \"dee1c8dd-766e-41de-8631-cfc7d23a7681\") " Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.272719 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dee1c8dd-766e-41de-8631-cfc7d23a7681-kube-api-access-w4cdp" (OuterVolumeSpecName: "kube-api-access-w4cdp") pod "dee1c8dd-766e-41de-8631-cfc7d23a7681" (UID: "dee1c8dd-766e-41de-8631-cfc7d23a7681"). InnerVolumeSpecName "kube-api-access-w4cdp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.295703 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee1c8dd-766e-41de-8631-cfc7d23a7681-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dee1c8dd-766e-41de-8631-cfc7d23a7681" (UID: "dee1c8dd-766e-41de-8631-cfc7d23a7681"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.308129 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee1c8dd-766e-41de-8631-cfc7d23a7681-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "dee1c8dd-766e-41de-8631-cfc7d23a7681" (UID: "dee1c8dd-766e-41de-8631-cfc7d23a7681"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.365534 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4cdp\" (UniqueName: \"kubernetes.io/projected/dee1c8dd-766e-41de-8631-cfc7d23a7681-kube-api-access-w4cdp\") on node \"crc\" DevicePath \"\"" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.365572 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dee1c8dd-766e-41de-8631-cfc7d23a7681-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.365582 4717 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dee1c8dd-766e-41de-8631-cfc7d23a7681-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.664815 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gshck" 
event={"ID":"dee1c8dd-766e-41de-8631-cfc7d23a7681","Type":"ContainerDied","Data":"f4a97f804720b1c528466ca4efc9efc3b4d77ad584dbd89c86acf74c7eaaaa93"} Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.664856 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4a97f804720b1c528466ca4efc9efc3b4d77ad584dbd89c86acf74c7eaaaa93" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.664887 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gshck" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.750440 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp"] Feb 21 22:15:31 crc kubenswrapper[4717]: E0221 22:15:31.750991 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee1c8dd-766e-41de-8631-cfc7d23a7681" containerName="ssh-known-hosts-edpm-deployment" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.751024 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee1c8dd-766e-41de-8631-cfc7d23a7681" containerName="ssh-known-hosts-edpm-deployment" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.751356 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee1c8dd-766e-41de-8631-cfc7d23a7681" containerName="ssh-known-hosts-edpm-deployment" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.754420 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.756987 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.757147 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.760431 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp"] Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.763248 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.763340 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.879082 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1311e18-5b26-49a3-86e3-481b3f9e5b03-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pslvp\" (UID: \"d1311e18-5b26-49a3-86e3-481b3f9e5b03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.879151 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98ksl\" (UniqueName: \"kubernetes.io/projected/d1311e18-5b26-49a3-86e3-481b3f9e5b03-kube-api-access-98ksl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pslvp\" (UID: \"d1311e18-5b26-49a3-86e3-481b3f9e5b03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.879233 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d1311e18-5b26-49a3-86e3-481b3f9e5b03-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pslvp\" (UID: \"d1311e18-5b26-49a3-86e3-481b3f9e5b03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.980548 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98ksl\" (UniqueName: \"kubernetes.io/projected/d1311e18-5b26-49a3-86e3-481b3f9e5b03-kube-api-access-98ksl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pslvp\" (UID: \"d1311e18-5b26-49a3-86e3-481b3f9e5b03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.981049 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d1311e18-5b26-49a3-86e3-481b3f9e5b03-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pslvp\" (UID: \"d1311e18-5b26-49a3-86e3-481b3f9e5b03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.981300 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1311e18-5b26-49a3-86e3-481b3f9e5b03-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pslvp\" (UID: \"d1311e18-5b26-49a3-86e3-481b3f9e5b03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.987637 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d1311e18-5b26-49a3-86e3-481b3f9e5b03-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-pslvp\" (UID: \"d1311e18-5b26-49a3-86e3-481b3f9e5b03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" Feb 21 22:15:31 crc kubenswrapper[4717]: I0221 22:15:31.988124 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1311e18-5b26-49a3-86e3-481b3f9e5b03-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pslvp\" (UID: \"d1311e18-5b26-49a3-86e3-481b3f9e5b03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" Feb 21 22:15:32 crc kubenswrapper[4717]: I0221 22:15:32.002936 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98ksl\" (UniqueName: \"kubernetes.io/projected/d1311e18-5b26-49a3-86e3-481b3f9e5b03-kube-api-access-98ksl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pslvp\" (UID: \"d1311e18-5b26-49a3-86e3-481b3f9e5b03\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" Feb 21 22:15:32 crc kubenswrapper[4717]: I0221 22:15:32.079674 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" Feb 21 22:15:32 crc kubenswrapper[4717]: I0221 22:15:32.650994 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp"] Feb 21 22:15:32 crc kubenswrapper[4717]: W0221 22:15:32.661044 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1311e18_5b26_49a3_86e3_481b3f9e5b03.slice/crio-7cf21b4efde7ec3ab705258f7c5d6190e351e9f9ac41b1b19903ce841395c7a3 WatchSource:0}: Error finding container 7cf21b4efde7ec3ab705258f7c5d6190e351e9f9ac41b1b19903ce841395c7a3: Status 404 returned error can't find the container with id 7cf21b4efde7ec3ab705258f7c5d6190e351e9f9ac41b1b19903ce841395c7a3 Feb 21 22:15:32 crc kubenswrapper[4717]: I0221 22:15:32.683692 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" event={"ID":"d1311e18-5b26-49a3-86e3-481b3f9e5b03","Type":"ContainerStarted","Data":"7cf21b4efde7ec3ab705258f7c5d6190e351e9f9ac41b1b19903ce841395c7a3"} Feb 21 22:15:33 crc kubenswrapper[4717]: I0221 22:15:33.693799 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" event={"ID":"d1311e18-5b26-49a3-86e3-481b3f9e5b03","Type":"ContainerStarted","Data":"95c90e1c4f9160d27aa564ce2cde3c89c023005283e942421ded649e5f804075"} Feb 21 22:15:36 crc kubenswrapper[4717]: I0221 22:15:36.058160 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" podStartSLOduration=4.652739331 podStartE2EDuration="5.058139764s" podCreationTimestamp="2026-02-21 22:15:31 +0000 UTC" firstStartedPulling="2026-02-21 22:15:32.664130138 +0000 UTC m=+1747.445663800" lastFinishedPulling="2026-02-21 22:15:33.069530601 +0000 UTC m=+1747.851064233" observedRunningTime="2026-02-21 
22:15:33.717307528 +0000 UTC m=+1748.498841160" watchObservedRunningTime="2026-02-21 22:15:36.058139764 +0000 UTC m=+1750.839673396" Feb 21 22:15:36 crc kubenswrapper[4717]: I0221 22:15:36.061068 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vqz8f"] Feb 21 22:15:36 crc kubenswrapper[4717]: I0221 22:15:36.075602 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vqz8f"] Feb 21 22:15:37 crc kubenswrapper[4717]: I0221 22:15:37.976471 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:15:37 crc kubenswrapper[4717]: E0221 22:15:37.976838 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:15:37 crc kubenswrapper[4717]: I0221 22:15:37.989914 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed51d19d-c6f1-427b-b1de-d7e1debf9870" path="/var/lib/kubelet/pods/ed51d19d-c6f1-427b-b1de-d7e1debf9870/volumes" Feb 21 22:15:41 crc kubenswrapper[4717]: I0221 22:15:41.789708 4717 generic.go:334] "Generic (PLEG): container finished" podID="d1311e18-5b26-49a3-86e3-481b3f9e5b03" containerID="95c90e1c4f9160d27aa564ce2cde3c89c023005283e942421ded649e5f804075" exitCode=0 Feb 21 22:15:41 crc kubenswrapper[4717]: I0221 22:15:41.789818 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" event={"ID":"d1311e18-5b26-49a3-86e3-481b3f9e5b03","Type":"ContainerDied","Data":"95c90e1c4f9160d27aa564ce2cde3c89c023005283e942421ded649e5f804075"} Feb 21 22:15:43 crc 
kubenswrapper[4717]: I0221 22:15:43.328134 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.342428 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d1311e18-5b26-49a3-86e3-481b3f9e5b03-ssh-key-openstack-edpm-ipam\") pod \"d1311e18-5b26-49a3-86e3-481b3f9e5b03\" (UID: \"d1311e18-5b26-49a3-86e3-481b3f9e5b03\") " Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.342728 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1311e18-5b26-49a3-86e3-481b3f9e5b03-inventory\") pod \"d1311e18-5b26-49a3-86e3-481b3f9e5b03\" (UID: \"d1311e18-5b26-49a3-86e3-481b3f9e5b03\") " Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.342933 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98ksl\" (UniqueName: \"kubernetes.io/projected/d1311e18-5b26-49a3-86e3-481b3f9e5b03-kube-api-access-98ksl\") pod \"d1311e18-5b26-49a3-86e3-481b3f9e5b03\" (UID: \"d1311e18-5b26-49a3-86e3-481b3f9e5b03\") " Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.364074 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1311e18-5b26-49a3-86e3-481b3f9e5b03-kube-api-access-98ksl" (OuterVolumeSpecName: "kube-api-access-98ksl") pod "d1311e18-5b26-49a3-86e3-481b3f9e5b03" (UID: "d1311e18-5b26-49a3-86e3-481b3f9e5b03"). InnerVolumeSpecName "kube-api-access-98ksl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.417041 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1311e18-5b26-49a3-86e3-481b3f9e5b03-inventory" (OuterVolumeSpecName: "inventory") pod "d1311e18-5b26-49a3-86e3-481b3f9e5b03" (UID: "d1311e18-5b26-49a3-86e3-481b3f9e5b03"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.433252 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1311e18-5b26-49a3-86e3-481b3f9e5b03-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d1311e18-5b26-49a3-86e3-481b3f9e5b03" (UID: "d1311e18-5b26-49a3-86e3-481b3f9e5b03"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.446104 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d1311e18-5b26-49a3-86e3-481b3f9e5b03-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.446142 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1311e18-5b26-49a3-86e3-481b3f9e5b03-inventory\") on node \"crc\" DevicePath \"\"" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.446151 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98ksl\" (UniqueName: \"kubernetes.io/projected/d1311e18-5b26-49a3-86e3-481b3f9e5b03-kube-api-access-98ksl\") on node \"crc\" DevicePath \"\"" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.813618 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" 
event={"ID":"d1311e18-5b26-49a3-86e3-481b3f9e5b03","Type":"ContainerDied","Data":"7cf21b4efde7ec3ab705258f7c5d6190e351e9f9ac41b1b19903ce841395c7a3"} Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.814071 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cf21b4efde7ec3ab705258f7c5d6190e351e9f9ac41b1b19903ce841395c7a3" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.813727 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pslvp" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.921295 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5"] Feb 21 22:15:43 crc kubenswrapper[4717]: E0221 22:15:43.924591 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1311e18-5b26-49a3-86e3-481b3f9e5b03" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.924633 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1311e18-5b26-49a3-86e3-481b3f9e5b03" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.925015 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1311e18-5b26-49a3-86e3-481b3f9e5b03" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.926274 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.929086 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.929588 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.929688 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.930896 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.951829 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5"] Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.956815 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52t4g\" (UniqueName: \"kubernetes.io/projected/849e3b78-1693-47ad-9fc7-63c7a188d53e-kube-api-access-52t4g\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5\" (UID: \"849e3b78-1693-47ad-9fc7-63c7a188d53e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 22:15:43.957093 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/849e3b78-1693-47ad-9fc7-63c7a188d53e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5\" (UID: \"849e3b78-1693-47ad-9fc7-63c7a188d53e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" Feb 21 22:15:43 crc kubenswrapper[4717]: I0221 
22:15:43.957222 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/849e3b78-1693-47ad-9fc7-63c7a188d53e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5\" (UID: \"849e3b78-1693-47ad-9fc7-63c7a188d53e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" Feb 21 22:15:44 crc kubenswrapper[4717]: I0221 22:15:44.059632 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/849e3b78-1693-47ad-9fc7-63c7a188d53e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5\" (UID: \"849e3b78-1693-47ad-9fc7-63c7a188d53e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" Feb 21 22:15:44 crc kubenswrapper[4717]: I0221 22:15:44.059806 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52t4g\" (UniqueName: \"kubernetes.io/projected/849e3b78-1693-47ad-9fc7-63c7a188d53e-kube-api-access-52t4g\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5\" (UID: \"849e3b78-1693-47ad-9fc7-63c7a188d53e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" Feb 21 22:15:44 crc kubenswrapper[4717]: I0221 22:15:44.059853 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/849e3b78-1693-47ad-9fc7-63c7a188d53e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5\" (UID: \"849e3b78-1693-47ad-9fc7-63c7a188d53e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" Feb 21 22:15:44 crc kubenswrapper[4717]: I0221 22:15:44.064297 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/849e3b78-1693-47ad-9fc7-63c7a188d53e-ssh-key-openstack-edpm-ipam\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5\" (UID: \"849e3b78-1693-47ad-9fc7-63c7a188d53e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" Feb 21 22:15:44 crc kubenswrapper[4717]: I0221 22:15:44.068347 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/849e3b78-1693-47ad-9fc7-63c7a188d53e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5\" (UID: \"849e3b78-1693-47ad-9fc7-63c7a188d53e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" Feb 21 22:15:44 crc kubenswrapper[4717]: I0221 22:15:44.082718 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52t4g\" (UniqueName: \"kubernetes.io/projected/849e3b78-1693-47ad-9fc7-63c7a188d53e-kube-api-access-52t4g\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5\" (UID: \"849e3b78-1693-47ad-9fc7-63c7a188d53e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" Feb 21 22:15:44 crc kubenswrapper[4717]: I0221 22:15:44.244607 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" Feb 21 22:15:44 crc kubenswrapper[4717]: I0221 22:15:44.796333 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5"] Feb 21 22:15:44 crc kubenswrapper[4717]: I0221 22:15:44.823306 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" event={"ID":"849e3b78-1693-47ad-9fc7-63c7a188d53e","Type":"ContainerStarted","Data":"95081e7f415ee44950c3ef3450a772b9e1516180d1e400d245a2801fe24aad1b"} Feb 21 22:15:45 crc kubenswrapper[4717]: I0221 22:15:45.836064 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" event={"ID":"849e3b78-1693-47ad-9fc7-63c7a188d53e","Type":"ContainerStarted","Data":"6ea996d12f3e04f27467f41df2717fe67d7c6936faa310e59d85a62f2e00dae2"} Feb 21 22:15:45 crc kubenswrapper[4717]: I0221 22:15:45.859176 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" podStartSLOduration=2.4350284220000002 podStartE2EDuration="2.859154218s" podCreationTimestamp="2026-02-21 22:15:43 +0000 UTC" firstStartedPulling="2026-02-21 22:15:44.80857055 +0000 UTC m=+1759.590104172" lastFinishedPulling="2026-02-21 22:15:45.232696336 +0000 UTC m=+1760.014229968" observedRunningTime="2026-02-21 22:15:45.859067496 +0000 UTC m=+1760.640601158" watchObservedRunningTime="2026-02-21 22:15:45.859154218 +0000 UTC m=+1760.640687860" Feb 21 22:15:52 crc kubenswrapper[4717]: I0221 22:15:52.976418 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:15:52 crc kubenswrapper[4717]: E0221 22:15:52.977876 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:15:54 crc kubenswrapper[4717]: I0221 22:15:54.923486 4717 generic.go:334] "Generic (PLEG): container finished" podID="849e3b78-1693-47ad-9fc7-63c7a188d53e" containerID="6ea996d12f3e04f27467f41df2717fe67d7c6936faa310e59d85a62f2e00dae2" exitCode=0 Feb 21 22:15:54 crc kubenswrapper[4717]: I0221 22:15:54.923640 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" event={"ID":"849e3b78-1693-47ad-9fc7-63c7a188d53e","Type":"ContainerDied","Data":"6ea996d12f3e04f27467f41df2717fe67d7c6936faa310e59d85a62f2e00dae2"} Feb 21 22:15:56 crc kubenswrapper[4717]: I0221 22:15:56.332268 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" Feb 21 22:15:56 crc kubenswrapper[4717]: I0221 22:15:56.514572 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52t4g\" (UniqueName: \"kubernetes.io/projected/849e3b78-1693-47ad-9fc7-63c7a188d53e-kube-api-access-52t4g\") pod \"849e3b78-1693-47ad-9fc7-63c7a188d53e\" (UID: \"849e3b78-1693-47ad-9fc7-63c7a188d53e\") " Feb 21 22:15:56 crc kubenswrapper[4717]: I0221 22:15:56.514727 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/849e3b78-1693-47ad-9fc7-63c7a188d53e-ssh-key-openstack-edpm-ipam\") pod \"849e3b78-1693-47ad-9fc7-63c7a188d53e\" (UID: \"849e3b78-1693-47ad-9fc7-63c7a188d53e\") " Feb 21 22:15:56 crc kubenswrapper[4717]: I0221 22:15:56.514819 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/849e3b78-1693-47ad-9fc7-63c7a188d53e-inventory\") pod \"849e3b78-1693-47ad-9fc7-63c7a188d53e\" (UID: \"849e3b78-1693-47ad-9fc7-63c7a188d53e\") " Feb 21 22:15:56 crc kubenswrapper[4717]: I0221 22:15:56.521113 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849e3b78-1693-47ad-9fc7-63c7a188d53e-kube-api-access-52t4g" (OuterVolumeSpecName: "kube-api-access-52t4g") pod "849e3b78-1693-47ad-9fc7-63c7a188d53e" (UID: "849e3b78-1693-47ad-9fc7-63c7a188d53e"). InnerVolumeSpecName "kube-api-access-52t4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:15:56 crc kubenswrapper[4717]: I0221 22:15:56.555015 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849e3b78-1693-47ad-9fc7-63c7a188d53e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "849e3b78-1693-47ad-9fc7-63c7a188d53e" (UID: "849e3b78-1693-47ad-9fc7-63c7a188d53e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:15:56 crc kubenswrapper[4717]: I0221 22:15:56.562115 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849e3b78-1693-47ad-9fc7-63c7a188d53e-inventory" (OuterVolumeSpecName: "inventory") pod "849e3b78-1693-47ad-9fc7-63c7a188d53e" (UID: "849e3b78-1693-47ad-9fc7-63c7a188d53e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:15:56 crc kubenswrapper[4717]: I0221 22:15:56.617711 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52t4g\" (UniqueName: \"kubernetes.io/projected/849e3b78-1693-47ad-9fc7-63c7a188d53e-kube-api-access-52t4g\") on node \"crc\" DevicePath \"\"" Feb 21 22:15:56 crc kubenswrapper[4717]: I0221 22:15:56.617758 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/849e3b78-1693-47ad-9fc7-63c7a188d53e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 21 22:15:56 crc kubenswrapper[4717]: I0221 22:15:56.617768 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/849e3b78-1693-47ad-9fc7-63c7a188d53e-inventory\") on node \"crc\" DevicePath \"\"" Feb 21 22:15:56 crc kubenswrapper[4717]: I0221 22:15:56.946385 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" event={"ID":"849e3b78-1693-47ad-9fc7-63c7a188d53e","Type":"ContainerDied","Data":"95081e7f415ee44950c3ef3450a772b9e1516180d1e400d245a2801fe24aad1b"} Feb 21 22:15:56 crc kubenswrapper[4717]: I0221 22:15:56.946448 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95081e7f415ee44950c3ef3450a772b9e1516180d1e400d245a2801fe24aad1b" Feb 21 22:15:56 crc kubenswrapper[4717]: I0221 22:15:56.946478 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.127239 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml"] Feb 21 22:15:57 crc kubenswrapper[4717]: E0221 22:15:57.127756 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849e3b78-1693-47ad-9fc7-63c7a188d53e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.127819 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="849e3b78-1693-47ad-9fc7-63c7a188d53e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.128064 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="849e3b78-1693-47ad-9fc7-63c7a188d53e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.128852 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.132134 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.132882 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.133299 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.133479 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.133628 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.133755 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.134922 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.135387 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.151229 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml"] Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.228668 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.228899 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.229004 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.229081 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.229150 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.229222 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.229309 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5bxj\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-kube-api-access-l5bxj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.229529 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.229601 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.229686 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.229770 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.229841 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.230041 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.230138 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.332459 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.332536 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.332592 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.332704 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.332760 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.332846 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.332964 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.333026 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.333075 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.333112 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.333156 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 
22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.333216 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5bxj\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-kube-api-access-l5bxj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.333331 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.333389 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.337877 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.340944 
4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.341189 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.341273 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.341528 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.342234 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.342492 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.343034 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.343161 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.343169 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.343419 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.343820 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.344050 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.359614 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5bxj\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-kube-api-access-l5bxj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j6gml\" (UID: 
\"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.447272 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:15:57 crc kubenswrapper[4717]: W0221 22:15:57.828080 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cddc12c_e9af_45b5_a4cc_c9c7f6d5f5f4.slice/crio-13243f9836b3495f27b9273a03781066dec759171bb93eb7d72c202f2534df3f WatchSource:0}: Error finding container 13243f9836b3495f27b9273a03781066dec759171bb93eb7d72c202f2534df3f: Status 404 returned error can't find the container with id 13243f9836b3495f27b9273a03781066dec759171bb93eb7d72c202f2534df3f Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.843057 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml"] Feb 21 22:15:57 crc kubenswrapper[4717]: I0221 22:15:57.958524 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" event={"ID":"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4","Type":"ContainerStarted","Data":"13243f9836b3495f27b9273a03781066dec759171bb93eb7d72c202f2534df3f"} Feb 21 22:15:58 crc kubenswrapper[4717]: I0221 22:15:58.975662 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" event={"ID":"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4","Type":"ContainerStarted","Data":"695ec11525d6d7d20d695b5aba6b05baa3af892901ac297204b007a2b0ddd1d9"} Feb 21 22:15:58 crc kubenswrapper[4717]: I0221 22:15:58.995932 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" podStartSLOduration=1.615883491 
podStartE2EDuration="1.995913691s" podCreationTimestamp="2026-02-21 22:15:57 +0000 UTC" firstStartedPulling="2026-02-21 22:15:57.830280084 +0000 UTC m=+1772.611813706" lastFinishedPulling="2026-02-21 22:15:58.210310284 +0000 UTC m=+1772.991843906" observedRunningTime="2026-02-21 22:15:58.995268205 +0000 UTC m=+1773.776801857" watchObservedRunningTime="2026-02-21 22:15:58.995913691 +0000 UTC m=+1773.777447313" Feb 21 22:16:07 crc kubenswrapper[4717]: I0221 22:16:07.976718 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:16:07 crc kubenswrapper[4717]: E0221 22:16:07.977828 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:16:20 crc kubenswrapper[4717]: I0221 22:16:20.977145 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d" Feb 21 22:16:22 crc kubenswrapper[4717]: I0221 22:16:22.197444 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"356b1e9dc0936986e22c54126561b6ec1ea02e09bd7ae30c9aaca97984e50335"} Feb 21 22:16:30 crc kubenswrapper[4717]: I0221 22:16:30.187992 4717 scope.go:117] "RemoveContainer" containerID="1f0746c9b963d4516b13d5fade38b15c3fd00ff0246ea8713aa6955edb970b09" Feb 21 22:16:38 crc kubenswrapper[4717]: I0221 22:16:38.370379 4717 generic.go:334] "Generic (PLEG): container finished" podID="6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" 
containerID="695ec11525d6d7d20d695b5aba6b05baa3af892901ac297204b007a2b0ddd1d9" exitCode=0 Feb 21 22:16:38 crc kubenswrapper[4717]: I0221 22:16:38.370419 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" event={"ID":"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4","Type":"ContainerDied","Data":"695ec11525d6d7d20d695b5aba6b05baa3af892901ac297204b007a2b0ddd1d9"} Feb 21 22:16:39 crc kubenswrapper[4717]: I0221 22:16:39.939171 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.062573 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-ovn-combined-ca-bundle\") pod \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.063076 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-ssh-key-openstack-edpm-ipam\") pod \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.063156 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-repo-setup-combined-ca-bundle\") pod \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.063218 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.063275 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.063557 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-telemetry-combined-ca-bundle\") pod \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.064155 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5bxj\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-kube-api-access-l5bxj\") pod \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.064214 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-neutron-metadata-combined-ca-bundle\") pod \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.064349 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-bootstrap-combined-ca-bundle\") pod \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.064392 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.064480 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.064547 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-libvirt-combined-ca-bundle\") pod \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.064585 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-nova-combined-ca-bundle\") pod \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.064711 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-inventory\") pod \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\" (UID: \"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4\") " Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.072608 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" (UID: "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.073155 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" (UID: "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.073961 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" (UID: "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.074190 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" (UID: "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.074248 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" (UID: "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.074412 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" (UID: "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.074952 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-kube-api-access-l5bxj" (OuterVolumeSpecName: "kube-api-access-l5bxj") pod "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" (UID: "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4"). InnerVolumeSpecName "kube-api-access-l5bxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.077098 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" (UID: "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.078520 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" (UID: "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.080073 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" (UID: "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.083109 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" (UID: "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.088031 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" (UID: "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.111540 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-inventory" (OuterVolumeSpecName: "inventory") pod "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" (UID: "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.132321 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" (UID: "6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.168338 4717 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.168395 4717 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.168416 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-inventory\") on node \"crc\" DevicePath \"\"" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.168434 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.168452 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.168472 4717 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.168491 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.168512 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.168530 4717 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.168548 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5bxj\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-kube-api-access-l5bxj\") on node \"crc\" DevicePath \"\"" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.168568 4717 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.168585 4717 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.168605 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:16:40 
crc kubenswrapper[4717]: I0221 22:16:40.168624 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.395255 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" event={"ID":"6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4","Type":"ContainerDied","Data":"13243f9836b3495f27b9273a03781066dec759171bb93eb7d72c202f2534df3f"} Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.395310 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13243f9836b3495f27b9273a03781066dec759171bb93eb7d72c202f2534df3f" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.395381 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j6gml" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.553111 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx"] Feb 21 22:16:40 crc kubenswrapper[4717]: E0221 22:16:40.553816 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.553900 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.554410 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 21 22:16:40 crc 
kubenswrapper[4717]: I0221 22:16:40.555846 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.559738 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.560239 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.561104 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.562219 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.564845 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.571164 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx"] Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.679171 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/cb09f299-8779-421d-a58f-dd16db2daadb-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dvvlx\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.679612 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94db6\" (UniqueName: \"kubernetes.io/projected/cb09f299-8779-421d-a58f-dd16db2daadb-kube-api-access-94db6\") 
pod \"ovn-edpm-deployment-openstack-edpm-ipam-dvvlx\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.679948 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dvvlx\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.680158 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dvvlx\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.680388 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dvvlx\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.782535 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94db6\" (UniqueName: \"kubernetes.io/projected/cb09f299-8779-421d-a58f-dd16db2daadb-kube-api-access-94db6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dvvlx\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 
22:16:40.782698 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dvvlx\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.782750 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dvvlx\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.782819 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dvvlx\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.782937 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/cb09f299-8779-421d-a58f-dd16db2daadb-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dvvlx\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.784172 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/cb09f299-8779-421d-a58f-dd16db2daadb-ovncontroller-config-0\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-dvvlx\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.789146 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dvvlx\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.793972 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dvvlx\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.796893 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dvvlx\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.805604 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94db6\" (UniqueName: \"kubernetes.io/projected/cb09f299-8779-421d-a58f-dd16db2daadb-kube-api-access-94db6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dvvlx\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:40 crc kubenswrapper[4717]: I0221 22:16:40.893321 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:16:41 crc kubenswrapper[4717]: I0221 22:16:41.552743 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx"] Feb 21 22:16:42 crc kubenswrapper[4717]: I0221 22:16:42.418493 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" event={"ID":"cb09f299-8779-421d-a58f-dd16db2daadb","Type":"ContainerStarted","Data":"ab376d1bd61b47fda80840c7cb56e898dd6cf5559b541d7a3ffbd855a1004238"} Feb 21 22:16:42 crc kubenswrapper[4717]: I0221 22:16:42.418763 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" event={"ID":"cb09f299-8779-421d-a58f-dd16db2daadb","Type":"ContainerStarted","Data":"1388b049dac08fe02687c1f8518b258621e476a39acc87a3a93c988ca5667277"} Feb 21 22:16:42 crc kubenswrapper[4717]: I0221 22:16:42.442848 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" podStartSLOduration=1.956615553 podStartE2EDuration="2.442830027s" podCreationTimestamp="2026-02-21 22:16:40 +0000 UTC" firstStartedPulling="2026-02-21 22:16:41.536420313 +0000 UTC m=+1816.317953935" lastFinishedPulling="2026-02-21 22:16:42.022634777 +0000 UTC m=+1816.804168409" observedRunningTime="2026-02-21 22:16:42.441362871 +0000 UTC m=+1817.222896543" watchObservedRunningTime="2026-02-21 22:16:42.442830027 +0000 UTC m=+1817.224363649" Feb 21 22:17:49 crc kubenswrapper[4717]: I0221 22:17:49.217956 4717 generic.go:334] "Generic (PLEG): container finished" podID="cb09f299-8779-421d-a58f-dd16db2daadb" containerID="ab376d1bd61b47fda80840c7cb56e898dd6cf5559b541d7a3ffbd855a1004238" exitCode=0 Feb 21 22:17:49 crc kubenswrapper[4717]: I0221 22:17:49.218018 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" event={"ID":"cb09f299-8779-421d-a58f-dd16db2daadb","Type":"ContainerDied","Data":"ab376d1bd61b47fda80840c7cb56e898dd6cf5559b541d7a3ffbd855a1004238"} Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.635376 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.749066 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-inventory\") pod \"cb09f299-8779-421d-a58f-dd16db2daadb\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.749129 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-ovn-combined-ca-bundle\") pod \"cb09f299-8779-421d-a58f-dd16db2daadb\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.749198 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-ssh-key-openstack-edpm-ipam\") pod \"cb09f299-8779-421d-a58f-dd16db2daadb\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.749414 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/cb09f299-8779-421d-a58f-dd16db2daadb-ovncontroller-config-0\") pod \"cb09f299-8779-421d-a58f-dd16db2daadb\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.749529 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-94db6\" (UniqueName: \"kubernetes.io/projected/cb09f299-8779-421d-a58f-dd16db2daadb-kube-api-access-94db6\") pod \"cb09f299-8779-421d-a58f-dd16db2daadb\" (UID: \"cb09f299-8779-421d-a58f-dd16db2daadb\") " Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.756416 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "cb09f299-8779-421d-a58f-dd16db2daadb" (UID: "cb09f299-8779-421d-a58f-dd16db2daadb"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.765296 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb09f299-8779-421d-a58f-dd16db2daadb-kube-api-access-94db6" (OuterVolumeSpecName: "kube-api-access-94db6") pod "cb09f299-8779-421d-a58f-dd16db2daadb" (UID: "cb09f299-8779-421d-a58f-dd16db2daadb"). InnerVolumeSpecName "kube-api-access-94db6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.779526 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-inventory" (OuterVolumeSpecName: "inventory") pod "cb09f299-8779-421d-a58f-dd16db2daadb" (UID: "cb09f299-8779-421d-a58f-dd16db2daadb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.782776 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb09f299-8779-421d-a58f-dd16db2daadb-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "cb09f299-8779-421d-a58f-dd16db2daadb" (UID: "cb09f299-8779-421d-a58f-dd16db2daadb"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.799944 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cb09f299-8779-421d-a58f-dd16db2daadb" (UID: "cb09f299-8779-421d-a58f-dd16db2daadb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.864096 4717 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/cb09f299-8779-421d-a58f-dd16db2daadb-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.864139 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94db6\" (UniqueName: \"kubernetes.io/projected/cb09f299-8779-421d-a58f-dd16db2daadb-kube-api-access-94db6\") on node \"crc\" DevicePath \"\"" Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.864170 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-inventory\") on node \"crc\" DevicePath \"\"" Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.864182 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:17:50 crc kubenswrapper[4717]: I0221 22:17:50.864194 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb09f299-8779-421d-a58f-dd16db2daadb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.242235 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" event={"ID":"cb09f299-8779-421d-a58f-dd16db2daadb","Type":"ContainerDied","Data":"1388b049dac08fe02687c1f8518b258621e476a39acc87a3a93c988ca5667277"} Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.242295 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1388b049dac08fe02687c1f8518b258621e476a39acc87a3a93c988ca5667277" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.242326 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dvvlx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.364358 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx"] Feb 21 22:17:51 crc kubenswrapper[4717]: E0221 22:17:51.364857 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb09f299-8779-421d-a58f-dd16db2daadb" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.364906 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb09f299-8779-421d-a58f-dd16db2daadb" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.365155 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb09f299-8779-421d-a58f-dd16db2daadb" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.365987 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.368302 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.368638 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.368837 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.369131 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.369288 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.369439 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.393252 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx"] Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.473992 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.474046 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.474100 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.474121 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhpxf\" (UniqueName: \"kubernetes.io/projected/788c0818-654d-4001-a3ab-06c9dbd10592-kube-api-access-nhpxf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.474157 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.474182 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.576182 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.576272 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.576346 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.576376 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhpxf\" (UniqueName: 
\"kubernetes.io/projected/788c0818-654d-4001-a3ab-06c9dbd10592-kube-api-access-nhpxf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.576415 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.576445 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.581744 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.582684 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.585168 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.585411 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.601911 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.608918 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhpxf\" (UniqueName: \"kubernetes.io/projected/788c0818-654d-4001-a3ab-06c9dbd10592-kube-api-access-nhpxf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:51 crc kubenswrapper[4717]: I0221 22:17:51.685329 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" Feb 21 22:17:52 crc kubenswrapper[4717]: I0221 22:17:52.073876 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx"] Feb 21 22:17:52 crc kubenswrapper[4717]: I0221 22:17:52.103644 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 21 22:17:52 crc kubenswrapper[4717]: I0221 22:17:52.250813 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" event={"ID":"788c0818-654d-4001-a3ab-06c9dbd10592","Type":"ContainerStarted","Data":"c5e0f872b90a52be274814e768769475d69bc73b7a25c2ce2506b7b574cde6de"} Feb 21 22:17:53 crc kubenswrapper[4717]: I0221 22:17:53.261850 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" event={"ID":"788c0818-654d-4001-a3ab-06c9dbd10592","Type":"ContainerStarted","Data":"ed3ab444c52c48b732b845217322b420580c355d84233926a77e1cc10a0874a3"} Feb 21 22:17:53 crc kubenswrapper[4717]: I0221 22:17:53.297180 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" podStartSLOduration=1.864868126 podStartE2EDuration="2.297097156s" podCreationTimestamp="2026-02-21 22:17:51 +0000 UTC" firstStartedPulling="2026-02-21 22:17:52.103422702 +0000 UTC m=+1886.884956324" lastFinishedPulling="2026-02-21 22:17:52.535651692 +0000 UTC m=+1887.317185354" observedRunningTime="2026-02-21 22:17:53.279324207 +0000 UTC m=+1888.060857829" watchObservedRunningTime="2026-02-21 22:17:53.297097156 +0000 UTC m=+1888.078630818" Feb 21 
22:18:39 crc kubenswrapper[4717]: I0221 22:18:39.063201 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 22:18:39 crc kubenswrapper[4717]: I0221 22:18:39.063905 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 21 22:18:44 crc kubenswrapper[4717]: I0221 22:18:44.811617 4717 generic.go:334] "Generic (PLEG): container finished" podID="788c0818-654d-4001-a3ab-06c9dbd10592" containerID="ed3ab444c52c48b732b845217322b420580c355d84233926a77e1cc10a0874a3" exitCode=0
Feb 21 22:18:44 crc kubenswrapper[4717]: I0221 22:18:44.811747 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" event={"ID":"788c0818-654d-4001-a3ab-06c9dbd10592","Type":"ContainerDied","Data":"ed3ab444c52c48b732b845217322b420580c355d84233926a77e1cc10a0874a3"}
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.405769 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx"
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.577781 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-neutron-metadata-combined-ca-bundle\") pod \"788c0818-654d-4001-a3ab-06c9dbd10592\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") "
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.578050 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-nova-metadata-neutron-config-0\") pod \"788c0818-654d-4001-a3ab-06c9dbd10592\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") "
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.578106 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-inventory\") pod \"788c0818-654d-4001-a3ab-06c9dbd10592\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") "
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.578181 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-neutron-ovn-metadata-agent-neutron-config-0\") pod \"788c0818-654d-4001-a3ab-06c9dbd10592\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") "
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.578289 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhpxf\" (UniqueName: \"kubernetes.io/projected/788c0818-654d-4001-a3ab-06c9dbd10592-kube-api-access-nhpxf\") pod \"788c0818-654d-4001-a3ab-06c9dbd10592\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") "
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.578565 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-ssh-key-openstack-edpm-ipam\") pod \"788c0818-654d-4001-a3ab-06c9dbd10592\" (UID: \"788c0818-654d-4001-a3ab-06c9dbd10592\") "
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.586817 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/788c0818-654d-4001-a3ab-06c9dbd10592-kube-api-access-nhpxf" (OuterVolumeSpecName: "kube-api-access-nhpxf") pod "788c0818-654d-4001-a3ab-06c9dbd10592" (UID: "788c0818-654d-4001-a3ab-06c9dbd10592"). InnerVolumeSpecName "kube-api-access-nhpxf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.587156 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "788c0818-654d-4001-a3ab-06c9dbd10592" (UID: "788c0818-654d-4001-a3ab-06c9dbd10592"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.617116 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "788c0818-654d-4001-a3ab-06c9dbd10592" (UID: "788c0818-654d-4001-a3ab-06c9dbd10592"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.628520 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "788c0818-654d-4001-a3ab-06c9dbd10592" (UID: "788c0818-654d-4001-a3ab-06c9dbd10592"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.629832 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "788c0818-654d-4001-a3ab-06c9dbd10592" (UID: "788c0818-654d-4001-a3ab-06c9dbd10592"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.651963 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-inventory" (OuterVolumeSpecName: "inventory") pod "788c0818-654d-4001-a3ab-06c9dbd10592" (UID: "788c0818-654d-4001-a3ab-06c9dbd10592"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.683678 4717 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.683738 4717 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.683759 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-inventory\") on node \"crc\" DevicePath \"\""
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.683778 4717 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.683798 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhpxf\" (UniqueName: \"kubernetes.io/projected/788c0818-654d-4001-a3ab-06c9dbd10592-kube-api-access-nhpxf\") on node \"crc\" DevicePath \"\""
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.683816 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/788c0818-654d-4001-a3ab-06c9dbd10592-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.840181 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx" event={"ID":"788c0818-654d-4001-a3ab-06c9dbd10592","Type":"ContainerDied","Data":"c5e0f872b90a52be274814e768769475d69bc73b7a25c2ce2506b7b574cde6de"}
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.840221 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5e0f872b90a52be274814e768769475d69bc73b7a25c2ce2506b7b574cde6de"
Feb 21 22:18:46 crc kubenswrapper[4717]: I0221 22:18:46.840278 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.045794 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"]
Feb 21 22:18:47 crc kubenswrapper[4717]: E0221 22:18:47.046314 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="788c0818-654d-4001-a3ab-06c9dbd10592" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.046332 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="788c0818-654d-4001-a3ab-06c9dbd10592" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.046566 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="788c0818-654d-4001-a3ab-06c9dbd10592" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.047402 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.050187 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.050510 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.050579 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.050649 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.050847 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.058771 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"]
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.191651 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xpj84\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.191805 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4zw8\" (UniqueName: \"kubernetes.io/projected/4dac73d5-a471-44a7-a91a-3422e09c7bb0-kube-api-access-l4zw8\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xpj84\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.191887 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xpj84\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.191988 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xpj84\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.192070 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xpj84\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.293172 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4zw8\" (UniqueName: \"kubernetes.io/projected/4dac73d5-a471-44a7-a91a-3422e09c7bb0-kube-api-access-l4zw8\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xpj84\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.293252 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xpj84\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.293340 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xpj84\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.293384 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xpj84\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.293426 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xpj84\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.297634 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xpj84\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.306848 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xpj84\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.310294 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xpj84\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.312502 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xpj84\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.318784 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4zw8\" (UniqueName: \"kubernetes.io/projected/4dac73d5-a471-44a7-a91a-3422e09c7bb0-kube-api-access-l4zw8\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xpj84\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.381175 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"
Feb 21 22:18:47 crc kubenswrapper[4717]: W0221 22:18:47.922233 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dac73d5_a471_44a7_a91a_3422e09c7bb0.slice/crio-f119328bcdbb2308d337912c5982e3db1bece291bc0cf54f89af0f9d518049d2 WatchSource:0}: Error finding container f119328bcdbb2308d337912c5982e3db1bece291bc0cf54f89af0f9d518049d2: Status 404 returned error can't find the container with id f119328bcdbb2308d337912c5982e3db1bece291bc0cf54f89af0f9d518049d2
Feb 21 22:18:47 crc kubenswrapper[4717]: I0221 22:18:47.922645 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84"]
Feb 21 22:18:48 crc kubenswrapper[4717]: I0221 22:18:48.863462 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84" event={"ID":"4dac73d5-a471-44a7-a91a-3422e09c7bb0","Type":"ContainerStarted","Data":"65246598cc847d22a8bd2c288d97edb548f6d4b6c1cddca468bc2941289f0b86"}
Feb 21 22:18:48 crc kubenswrapper[4717]: I0221 22:18:48.863968 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84" event={"ID":"4dac73d5-a471-44a7-a91a-3422e09c7bb0","Type":"ContainerStarted","Data":"f119328bcdbb2308d337912c5982e3db1bece291bc0cf54f89af0f9d518049d2"}
Feb 21 22:18:48 crc kubenswrapper[4717]: I0221 22:18:48.918037 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84" podStartSLOduration=1.3850193530000001 podStartE2EDuration="1.918012158s" podCreationTimestamp="2026-02-21 22:18:47 +0000 UTC" firstStartedPulling="2026-02-21 22:18:47.926917819 +0000 UTC m=+1942.708451451" lastFinishedPulling="2026-02-21 22:18:48.459910624 +0000 UTC m=+1943.241444256" observedRunningTime="2026-02-21 22:18:48.886615058 +0000 UTC m=+1943.668148720" watchObservedRunningTime="2026-02-21 22:18:48.918012158 +0000 UTC m=+1943.699545790"
Feb 21 22:19:09 crc kubenswrapper[4717]: I0221 22:19:09.062221 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 22:19:09 crc kubenswrapper[4717]: I0221 22:19:09.063017 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 21 22:19:39 crc kubenswrapper[4717]: I0221 22:19:39.063184 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 22:19:39 crc kubenswrapper[4717]: I0221 22:19:39.064042 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 21 22:19:39 crc kubenswrapper[4717]: I0221 22:19:39.064105 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22"
Feb 21 22:19:39 crc kubenswrapper[4717]: I0221 22:19:39.065304 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"356b1e9dc0936986e22c54126561b6ec1ea02e09bd7ae30c9aaca97984e50335"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 21 22:19:39 crc kubenswrapper[4717]: I0221 22:19:39.065409 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://356b1e9dc0936986e22c54126561b6ec1ea02e09bd7ae30c9aaca97984e50335" gracePeriod=600
Feb 21 22:19:39 crc kubenswrapper[4717]: I0221 22:19:39.432333 4717 generic.go:334] "Generic (PLEG): container finished" podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerID="356b1e9dc0936986e22c54126561b6ec1ea02e09bd7ae30c9aaca97984e50335" exitCode=0
Feb 21 22:19:39 crc kubenswrapper[4717]: I0221 22:19:39.432568 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"356b1e9dc0936986e22c54126561b6ec1ea02e09bd7ae30c9aaca97984e50335"}
Feb 21 22:19:39 crc kubenswrapper[4717]: I0221 22:19:39.432772 4717 scope.go:117] "RemoveContainer" containerID="fc0b3d91ff99a158b18e6dc8eedf68b484e8699b7ec9629b5a13c53ce46cb97d"
Feb 21 22:19:40 crc kubenswrapper[4717]: I0221 22:19:40.448595 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642"}
Feb 21 22:21:39 crc kubenswrapper[4717]: I0221 22:21:39.062428 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 22:21:39 crc kubenswrapper[4717]: I0221 22:21:39.062937 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 21 22:22:09 crc kubenswrapper[4717]: I0221 22:22:09.063310 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 22:22:09 crc kubenswrapper[4717]: I0221 22:22:09.064139 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 21 22:22:27 crc kubenswrapper[4717]: I0221 22:22:27.412612 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bjf49"]
Feb 21 22:22:27 crc kubenswrapper[4717]: I0221 22:22:27.415480 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjf49"
Feb 21 22:22:27 crc kubenswrapper[4717]: I0221 22:22:27.422992 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjf49"]
Feb 21 22:22:27 crc kubenswrapper[4717]: I0221 22:22:27.457112 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p265\" (UniqueName: \"kubernetes.io/projected/48a83cf4-e5b8-4226-879a-1363e6006e98-kube-api-access-8p265\") pod \"redhat-marketplace-bjf49\" (UID: \"48a83cf4-e5b8-4226-879a-1363e6006e98\") " pod="openshift-marketplace/redhat-marketplace-bjf49"
Feb 21 22:22:27 crc kubenswrapper[4717]: I0221 22:22:27.457235 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48a83cf4-e5b8-4226-879a-1363e6006e98-utilities\") pod \"redhat-marketplace-bjf49\" (UID: \"48a83cf4-e5b8-4226-879a-1363e6006e98\") " pod="openshift-marketplace/redhat-marketplace-bjf49"
Feb 21 22:22:27 crc kubenswrapper[4717]: I0221 22:22:27.457258 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a83cf4-e5b8-4226-879a-1363e6006e98-catalog-content\") pod \"redhat-marketplace-bjf49\" (UID: \"48a83cf4-e5b8-4226-879a-1363e6006e98\") " pod="openshift-marketplace/redhat-marketplace-bjf49"
Feb 21 22:22:27 crc kubenswrapper[4717]: I0221 22:22:27.559602 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48a83cf4-e5b8-4226-879a-1363e6006e98-utilities\") pod \"redhat-marketplace-bjf49\" (UID: \"48a83cf4-e5b8-4226-879a-1363e6006e98\") " pod="openshift-marketplace/redhat-marketplace-bjf49"
Feb 21 22:22:27 crc kubenswrapper[4717]: I0221 22:22:27.559663 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a83cf4-e5b8-4226-879a-1363e6006e98-catalog-content\") pod \"redhat-marketplace-bjf49\" (UID: \"48a83cf4-e5b8-4226-879a-1363e6006e98\") " pod="openshift-marketplace/redhat-marketplace-bjf49"
Feb 21 22:22:27 crc kubenswrapper[4717]: I0221 22:22:27.559825 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p265\" (UniqueName: \"kubernetes.io/projected/48a83cf4-e5b8-4226-879a-1363e6006e98-kube-api-access-8p265\") pod \"redhat-marketplace-bjf49\" (UID: \"48a83cf4-e5b8-4226-879a-1363e6006e98\") " pod="openshift-marketplace/redhat-marketplace-bjf49"
Feb 21 22:22:27 crc kubenswrapper[4717]: I0221 22:22:27.560241 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48a83cf4-e5b8-4226-879a-1363e6006e98-utilities\") pod \"redhat-marketplace-bjf49\" (UID: \"48a83cf4-e5b8-4226-879a-1363e6006e98\") " pod="openshift-marketplace/redhat-marketplace-bjf49"
Feb 21 22:22:27 crc kubenswrapper[4717]: I0221 22:22:27.560324 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a83cf4-e5b8-4226-879a-1363e6006e98-catalog-content\") pod \"redhat-marketplace-bjf49\" (UID: \"48a83cf4-e5b8-4226-879a-1363e6006e98\") " pod="openshift-marketplace/redhat-marketplace-bjf49"
Feb 21 22:22:27 crc kubenswrapper[4717]: I0221 22:22:27.578665 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p265\" (UniqueName: \"kubernetes.io/projected/48a83cf4-e5b8-4226-879a-1363e6006e98-kube-api-access-8p265\") pod \"redhat-marketplace-bjf49\" (UID: \"48a83cf4-e5b8-4226-879a-1363e6006e98\") " pod="openshift-marketplace/redhat-marketplace-bjf49"
Feb 21 22:22:27 crc kubenswrapper[4717]: I0221 22:22:27.736608 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjf49"
Feb 21 22:22:28 crc kubenswrapper[4717]: I0221 22:22:28.233363 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjf49"]
Feb 21 22:22:28 crc kubenswrapper[4717]: W0221 22:22:28.240120 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48a83cf4_e5b8_4226_879a_1363e6006e98.slice/crio-fb8118613c1e550be8ab108d3ddcc6085c44804dda882ff0d4950252692d8c3b WatchSource:0}: Error finding container fb8118613c1e550be8ab108d3ddcc6085c44804dda882ff0d4950252692d8c3b: Status 404 returned error can't find the container with id fb8118613c1e550be8ab108d3ddcc6085c44804dda882ff0d4950252692d8c3b
Feb 21 22:22:28 crc kubenswrapper[4717]: I0221 22:22:28.350994 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjf49" event={"ID":"48a83cf4-e5b8-4226-879a-1363e6006e98","Type":"ContainerStarted","Data":"fb8118613c1e550be8ab108d3ddcc6085c44804dda882ff0d4950252692d8c3b"}
Feb 21 22:22:29 crc kubenswrapper[4717]: I0221 22:22:29.361807 4717 generic.go:334] "Generic (PLEG): container finished" podID="48a83cf4-e5b8-4226-879a-1363e6006e98" containerID="1794c5827f1eee7bd45326ca61bded91d19b41356fa015c9f4280aa8d3bbcf0b" exitCode=0
Feb 21 22:22:29 crc kubenswrapper[4717]: I0221 22:22:29.361897 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjf49" event={"ID":"48a83cf4-e5b8-4226-879a-1363e6006e98","Type":"ContainerDied","Data":"1794c5827f1eee7bd45326ca61bded91d19b41356fa015c9f4280aa8d3bbcf0b"}
Feb 21 22:22:31 crc kubenswrapper[4717]: I0221 22:22:31.382666 4717 generic.go:334] "Generic (PLEG): container finished" podID="48a83cf4-e5b8-4226-879a-1363e6006e98" containerID="97b1c74a1942010e83ce44e742d43e9b8b71d3bfd2172bbfcff4a2ef611f2387" exitCode=0
Feb 21 22:22:31 crc kubenswrapper[4717]: I0221 22:22:31.382779 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjf49" event={"ID":"48a83cf4-e5b8-4226-879a-1363e6006e98","Type":"ContainerDied","Data":"97b1c74a1942010e83ce44e742d43e9b8b71d3bfd2172bbfcff4a2ef611f2387"}
Feb 21 22:22:32 crc kubenswrapper[4717]: I0221 22:22:32.395417 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjf49" event={"ID":"48a83cf4-e5b8-4226-879a-1363e6006e98","Type":"ContainerStarted","Data":"ebcec87b6d9e131d4d6587ff9cf7c15704c56dcc7e4757f86a13be56e604bb9b"}
Feb 21 22:22:37 crc kubenswrapper[4717]: I0221 22:22:37.736843 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bjf49"
Feb 21 22:22:37 crc kubenswrapper[4717]: I0221 22:22:37.737570 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bjf49"
Feb 21 22:22:37 crc kubenswrapper[4717]: I0221 22:22:37.810098 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bjf49"
Feb 21 22:22:37 crc kubenswrapper[4717]: I0221 22:22:37.852016 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bjf49" podStartSLOduration=8.43468328 podStartE2EDuration="10.851993714s" podCreationTimestamp="2026-02-21 22:22:27 +0000 UTC" firstStartedPulling="2026-02-21 22:22:29.364053423 +0000 UTC m=+2164.145587045" lastFinishedPulling="2026-02-21 22:22:31.781363857 +0000 UTC m=+2166.562897479" observedRunningTime="2026-02-21 22:22:32.41684112 +0000 UTC m=+2167.198374762" watchObservedRunningTime="2026-02-21 22:22:37.851993714 +0000 UTC m=+2172.633527346"
Feb 21 22:22:38 crc kubenswrapper[4717]: I0221 22:22:38.265274 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zknb9"]
Feb 21 22:22:38 crc kubenswrapper[4717]: I0221 22:22:38.267622 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zknb9"
Feb 21 22:22:38 crc kubenswrapper[4717]: I0221 22:22:38.285081 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zknb9"]
Feb 21 22:22:38 crc kubenswrapper[4717]: I0221 22:22:38.402076 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caf52af6-209b-42df-bb36-dcc6c3f94a63-catalog-content\") pod \"certified-operators-zknb9\" (UID: \"caf52af6-209b-42df-bb36-dcc6c3f94a63\") " pod="openshift-marketplace/certified-operators-zknb9"
Feb 21 22:22:38 crc kubenswrapper[4717]: I0221 22:22:38.402217 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caf52af6-209b-42df-bb36-dcc6c3f94a63-utilities\") pod \"certified-operators-zknb9\" (UID: \"caf52af6-209b-42df-bb36-dcc6c3f94a63\") " pod="openshift-marketplace/certified-operators-zknb9"
Feb 21 22:22:38 crc kubenswrapper[4717]: I0221 22:22:38.402260 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fjz6\" (UniqueName: \"kubernetes.io/projected/caf52af6-209b-42df-bb36-dcc6c3f94a63-kube-api-access-5fjz6\") pod \"certified-operators-zknb9\" (UID: \"caf52af6-209b-42df-bb36-dcc6c3f94a63\") " pod="openshift-marketplace/certified-operators-zknb9"
Feb 21 22:22:38 crc kubenswrapper[4717]: I0221 22:22:38.500876 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bjf49"
Feb 21 22:22:38 crc kubenswrapper[4717]: I0221 22:22:38.503987 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caf52af6-209b-42df-bb36-dcc6c3f94a63-catalog-content\") pod \"certified-operators-zknb9\" (UID: \"caf52af6-209b-42df-bb36-dcc6c3f94a63\") " pod="openshift-marketplace/certified-operators-zknb9"
Feb 21 22:22:38 crc kubenswrapper[4717]: I0221 22:22:38.504107 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caf52af6-209b-42df-bb36-dcc6c3f94a63-utilities\") pod \"certified-operators-zknb9\" (UID: \"caf52af6-209b-42df-bb36-dcc6c3f94a63\") " pod="openshift-marketplace/certified-operators-zknb9"
Feb 21 22:22:38 crc kubenswrapper[4717]: I0221 22:22:38.504154 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fjz6\" (UniqueName: \"kubernetes.io/projected/caf52af6-209b-42df-bb36-dcc6c3f94a63-kube-api-access-5fjz6\") pod \"certified-operators-zknb9\" (UID: \"caf52af6-209b-42df-bb36-dcc6c3f94a63\") " pod="openshift-marketplace/certified-operators-zknb9"
Feb 21 22:22:38 crc kubenswrapper[4717]: I0221 22:22:38.504753 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caf52af6-209b-42df-bb36-dcc6c3f94a63-utilities\") pod \"certified-operators-zknb9\" (UID: \"caf52af6-209b-42df-bb36-dcc6c3f94a63\") " pod="openshift-marketplace/certified-operators-zknb9"
Feb 21 22:22:38 crc kubenswrapper[4717]: I0221 22:22:38.504947 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caf52af6-209b-42df-bb36-dcc6c3f94a63-catalog-content\") pod \"certified-operators-zknb9\" (UID: \"caf52af6-209b-42df-bb36-dcc6c3f94a63\") " pod="openshift-marketplace/certified-operators-zknb9"
Feb 21 22:22:38 crc kubenswrapper[4717]: I0221 22:22:38.527016 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fjz6\" (UniqueName: \"kubernetes.io/projected/caf52af6-209b-42df-bb36-dcc6c3f94a63-kube-api-access-5fjz6\") pod \"certified-operators-zknb9\" (UID: \"caf52af6-209b-42df-bb36-dcc6c3f94a63\") " pod="openshift-marketplace/certified-operators-zknb9"
Feb 21 22:22:38 crc kubenswrapper[4717]: I0221 22:22:38.587428 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zknb9"
Feb 21 22:22:39 crc kubenswrapper[4717]: I0221 22:22:39.062288 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 22:22:39 crc kubenswrapper[4717]: I0221 22:22:39.062883 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 21 22:22:39 crc kubenswrapper[4717]: I0221 22:22:39.062926 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22"
Feb 21 22:22:39 crc kubenswrapper[4717]: I0221 22:22:39.063660 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 21 22:22:39 crc kubenswrapper[4717]: I0221 22:22:39.063716 4717 kuberuntime_container.go:808] "Killing container with a grace period"
pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" gracePeriod=600 Feb 21 22:22:39 crc kubenswrapper[4717]: I0221 22:22:39.109966 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zknb9"] Feb 21 22:22:39 crc kubenswrapper[4717]: E0221 22:22:39.186303 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:22:39 crc kubenswrapper[4717]: I0221 22:22:39.467648 4717 generic.go:334] "Generic (PLEG): container finished" podID="caf52af6-209b-42df-bb36-dcc6c3f94a63" containerID="eb340ed774ffdad49991eab943abc288725cf920deec14a0b8ea2c25b4f0a658" exitCode=0 Feb 21 22:22:39 crc kubenswrapper[4717]: I0221 22:22:39.467768 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zknb9" event={"ID":"caf52af6-209b-42df-bb36-dcc6c3f94a63","Type":"ContainerDied","Data":"eb340ed774ffdad49991eab943abc288725cf920deec14a0b8ea2c25b4f0a658"} Feb 21 22:22:39 crc kubenswrapper[4717]: I0221 22:22:39.467823 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zknb9" event={"ID":"caf52af6-209b-42df-bb36-dcc6c3f94a63","Type":"ContainerStarted","Data":"a47bc0024eea37704da0e77391d74494defd86cfee38994d7bcfe78426e268a9"} Feb 21 22:22:39 crc kubenswrapper[4717]: I0221 22:22:39.473451 4717 generic.go:334] "Generic (PLEG): container finished" podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" 
containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" exitCode=0 Feb 21 22:22:39 crc kubenswrapper[4717]: I0221 22:22:39.473547 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642"} Feb 21 22:22:39 crc kubenswrapper[4717]: I0221 22:22:39.473660 4717 scope.go:117] "RemoveContainer" containerID="356b1e9dc0936986e22c54126561b6ec1ea02e09bd7ae30c9aaca97984e50335" Feb 21 22:22:39 crc kubenswrapper[4717]: I0221 22:22:39.475288 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:22:39 crc kubenswrapper[4717]: E0221 22:22:39.475782 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:22:40 crc kubenswrapper[4717]: I0221 22:22:40.490692 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zknb9" event={"ID":"caf52af6-209b-42df-bb36-dcc6c3f94a63","Type":"ContainerStarted","Data":"b4c46acdfef739bef3c12893d005fb1227e9de1d5604b3665fca46fd59620e02"} Feb 21 22:22:40 crc kubenswrapper[4717]: I0221 22:22:40.876692 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjf49"] Feb 21 22:22:40 crc kubenswrapper[4717]: I0221 22:22:40.877044 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bjf49" 
podUID="48a83cf4-e5b8-4226-879a-1363e6006e98" containerName="registry-server" containerID="cri-o://ebcec87b6d9e131d4d6587ff9cf7c15704c56dcc7e4757f86a13be56e604bb9b" gracePeriod=2 Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.468250 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjf49" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.541406 4717 generic.go:334] "Generic (PLEG): container finished" podID="48a83cf4-e5b8-4226-879a-1363e6006e98" containerID="ebcec87b6d9e131d4d6587ff9cf7c15704c56dcc7e4757f86a13be56e604bb9b" exitCode=0 Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.541487 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjf49" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.541514 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjf49" event={"ID":"48a83cf4-e5b8-4226-879a-1363e6006e98","Type":"ContainerDied","Data":"ebcec87b6d9e131d4d6587ff9cf7c15704c56dcc7e4757f86a13be56e604bb9b"} Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.541582 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjf49" event={"ID":"48a83cf4-e5b8-4226-879a-1363e6006e98","Type":"ContainerDied","Data":"fb8118613c1e550be8ab108d3ddcc6085c44804dda882ff0d4950252692d8c3b"} Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.541610 4717 scope.go:117] "RemoveContainer" containerID="ebcec87b6d9e131d4d6587ff9cf7c15704c56dcc7e4757f86a13be56e604bb9b" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.545119 4717 generic.go:334] "Generic (PLEG): container finished" podID="caf52af6-209b-42df-bb36-dcc6c3f94a63" containerID="b4c46acdfef739bef3c12893d005fb1227e9de1d5604b3665fca46fd59620e02" exitCode=0 Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.545165 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zknb9" event={"ID":"caf52af6-209b-42df-bb36-dcc6c3f94a63","Type":"ContainerDied","Data":"b4c46acdfef739bef3c12893d005fb1227e9de1d5604b3665fca46fd59620e02"} Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.571853 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48a83cf4-e5b8-4226-879a-1363e6006e98-utilities\") pod \"48a83cf4-e5b8-4226-879a-1363e6006e98\" (UID: \"48a83cf4-e5b8-4226-879a-1363e6006e98\") " Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.572186 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a83cf4-e5b8-4226-879a-1363e6006e98-catalog-content\") pod \"48a83cf4-e5b8-4226-879a-1363e6006e98\" (UID: \"48a83cf4-e5b8-4226-879a-1363e6006e98\") " Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.572310 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p265\" (UniqueName: \"kubernetes.io/projected/48a83cf4-e5b8-4226-879a-1363e6006e98-kube-api-access-8p265\") pod \"48a83cf4-e5b8-4226-879a-1363e6006e98\" (UID: \"48a83cf4-e5b8-4226-879a-1363e6006e98\") " Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.577502 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48a83cf4-e5b8-4226-879a-1363e6006e98-utilities" (OuterVolumeSpecName: "utilities") pod "48a83cf4-e5b8-4226-879a-1363e6006e98" (UID: "48a83cf4-e5b8-4226-879a-1363e6006e98"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.590453 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48a83cf4-e5b8-4226-879a-1363e6006e98-kube-api-access-8p265" (OuterVolumeSpecName: "kube-api-access-8p265") pod "48a83cf4-e5b8-4226-879a-1363e6006e98" (UID: "48a83cf4-e5b8-4226-879a-1363e6006e98"). InnerVolumeSpecName "kube-api-access-8p265". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.600878 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48a83cf4-e5b8-4226-879a-1363e6006e98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48a83cf4-e5b8-4226-879a-1363e6006e98" (UID: "48a83cf4-e5b8-4226-879a-1363e6006e98"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.614483 4717 scope.go:117] "RemoveContainer" containerID="97b1c74a1942010e83ce44e742d43e9b8b71d3bfd2172bbfcff4a2ef611f2387" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.647237 4717 scope.go:117] "RemoveContainer" containerID="1794c5827f1eee7bd45326ca61bded91d19b41356fa015c9f4280aa8d3bbcf0b" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.674210 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a83cf4-e5b8-4226-879a-1363e6006e98-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.674571 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p265\" (UniqueName: \"kubernetes.io/projected/48a83cf4-e5b8-4226-879a-1363e6006e98-kube-api-access-8p265\") on node \"crc\" DevicePath \"\"" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.674582 4717 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48a83cf4-e5b8-4226-879a-1363e6006e98-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.679791 4717 scope.go:117] "RemoveContainer" containerID="ebcec87b6d9e131d4d6587ff9cf7c15704c56dcc7e4757f86a13be56e604bb9b" Feb 21 22:22:41 crc kubenswrapper[4717]: E0221 22:22:41.680265 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebcec87b6d9e131d4d6587ff9cf7c15704c56dcc7e4757f86a13be56e604bb9b\": container with ID starting with ebcec87b6d9e131d4d6587ff9cf7c15704c56dcc7e4757f86a13be56e604bb9b not found: ID does not exist" containerID="ebcec87b6d9e131d4d6587ff9cf7c15704c56dcc7e4757f86a13be56e604bb9b" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.680308 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebcec87b6d9e131d4d6587ff9cf7c15704c56dcc7e4757f86a13be56e604bb9b"} err="failed to get container status \"ebcec87b6d9e131d4d6587ff9cf7c15704c56dcc7e4757f86a13be56e604bb9b\": rpc error: code = NotFound desc = could not find container \"ebcec87b6d9e131d4d6587ff9cf7c15704c56dcc7e4757f86a13be56e604bb9b\": container with ID starting with ebcec87b6d9e131d4d6587ff9cf7c15704c56dcc7e4757f86a13be56e604bb9b not found: ID does not exist" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.680336 4717 scope.go:117] "RemoveContainer" containerID="97b1c74a1942010e83ce44e742d43e9b8b71d3bfd2172bbfcff4a2ef611f2387" Feb 21 22:22:41 crc kubenswrapper[4717]: E0221 22:22:41.680748 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97b1c74a1942010e83ce44e742d43e9b8b71d3bfd2172bbfcff4a2ef611f2387\": container with ID starting with 97b1c74a1942010e83ce44e742d43e9b8b71d3bfd2172bbfcff4a2ef611f2387 not found: ID does not exist" 
containerID="97b1c74a1942010e83ce44e742d43e9b8b71d3bfd2172bbfcff4a2ef611f2387" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.680784 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97b1c74a1942010e83ce44e742d43e9b8b71d3bfd2172bbfcff4a2ef611f2387"} err="failed to get container status \"97b1c74a1942010e83ce44e742d43e9b8b71d3bfd2172bbfcff4a2ef611f2387\": rpc error: code = NotFound desc = could not find container \"97b1c74a1942010e83ce44e742d43e9b8b71d3bfd2172bbfcff4a2ef611f2387\": container with ID starting with 97b1c74a1942010e83ce44e742d43e9b8b71d3bfd2172bbfcff4a2ef611f2387 not found: ID does not exist" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.680805 4717 scope.go:117] "RemoveContainer" containerID="1794c5827f1eee7bd45326ca61bded91d19b41356fa015c9f4280aa8d3bbcf0b" Feb 21 22:22:41 crc kubenswrapper[4717]: E0221 22:22:41.681096 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1794c5827f1eee7bd45326ca61bded91d19b41356fa015c9f4280aa8d3bbcf0b\": container with ID starting with 1794c5827f1eee7bd45326ca61bded91d19b41356fa015c9f4280aa8d3bbcf0b not found: ID does not exist" containerID="1794c5827f1eee7bd45326ca61bded91d19b41356fa015c9f4280aa8d3bbcf0b" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.681131 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1794c5827f1eee7bd45326ca61bded91d19b41356fa015c9f4280aa8d3bbcf0b"} err="failed to get container status \"1794c5827f1eee7bd45326ca61bded91d19b41356fa015c9f4280aa8d3bbcf0b\": rpc error: code = NotFound desc = could not find container \"1794c5827f1eee7bd45326ca61bded91d19b41356fa015c9f4280aa8d3bbcf0b\": container with ID starting with 1794c5827f1eee7bd45326ca61bded91d19b41356fa015c9f4280aa8d3bbcf0b not found: ID does not exist" Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.888932 4717 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjf49"] Feb 21 22:22:41 crc kubenswrapper[4717]: I0221 22:22:41.896508 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjf49"] Feb 21 22:22:42 crc kubenswrapper[4717]: I0221 22:22:42.001693 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48a83cf4-e5b8-4226-879a-1363e6006e98" path="/var/lib/kubelet/pods/48a83cf4-e5b8-4226-879a-1363e6006e98/volumes" Feb 21 22:22:42 crc kubenswrapper[4717]: I0221 22:22:42.560087 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zknb9" event={"ID":"caf52af6-209b-42df-bb36-dcc6c3f94a63","Type":"ContainerStarted","Data":"161791f11f8205641e34caad4bfaea0f67b96339a063e306d9823f2e86919bcb"} Feb 21 22:22:42 crc kubenswrapper[4717]: I0221 22:22:42.594187 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zknb9" podStartSLOduration=2.081054299 podStartE2EDuration="4.594161372s" podCreationTimestamp="2026-02-21 22:22:38 +0000 UTC" firstStartedPulling="2026-02-21 22:22:39.471600269 +0000 UTC m=+2174.253133931" lastFinishedPulling="2026-02-21 22:22:41.984707372 +0000 UTC m=+2176.766241004" observedRunningTime="2026-02-21 22:22:42.578260913 +0000 UTC m=+2177.359794585" watchObservedRunningTime="2026-02-21 22:22:42.594161372 +0000 UTC m=+2177.375695004" Feb 21 22:22:47 crc kubenswrapper[4717]: I0221 22:22:47.619154 4717 generic.go:334] "Generic (PLEG): container finished" podID="4dac73d5-a471-44a7-a91a-3422e09c7bb0" containerID="65246598cc847d22a8bd2c288d97edb548f6d4b6c1cddca468bc2941289f0b86" exitCode=0 Feb 21 22:22:47 crc kubenswrapper[4717]: I0221 22:22:47.619292 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84" 
event={"ID":"4dac73d5-a471-44a7-a91a-3422e09c7bb0","Type":"ContainerDied","Data":"65246598cc847d22a8bd2c288d97edb548f6d4b6c1cddca468bc2941289f0b86"} Feb 21 22:22:48 crc kubenswrapper[4717]: I0221 22:22:48.588519 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zknb9" Feb 21 22:22:48 crc kubenswrapper[4717]: I0221 22:22:48.588820 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zknb9" Feb 21 22:22:48 crc kubenswrapper[4717]: I0221 22:22:48.670842 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zknb9" Feb 21 22:22:48 crc kubenswrapper[4717]: I0221 22:22:48.768415 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zknb9" Feb 21 22:22:48 crc kubenswrapper[4717]: I0221 22:22:48.919443 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zknb9"] Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.123126 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.151415 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4zw8\" (UniqueName: \"kubernetes.io/projected/4dac73d5-a471-44a7-a91a-3422e09c7bb0-kube-api-access-l4zw8\") pod \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.151555 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-inventory\") pod \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.151642 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-libvirt-combined-ca-bundle\") pod \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.151716 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-ssh-key-openstack-edpm-ipam\") pod \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.151753 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-libvirt-secret-0\") pod \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\" (UID: \"4dac73d5-a471-44a7-a91a-3422e09c7bb0\") " Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.158588 4717 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dac73d5-a471-44a7-a91a-3422e09c7bb0-kube-api-access-l4zw8" (OuterVolumeSpecName: "kube-api-access-l4zw8") pod "4dac73d5-a471-44a7-a91a-3422e09c7bb0" (UID: "4dac73d5-a471-44a7-a91a-3422e09c7bb0"). InnerVolumeSpecName "kube-api-access-l4zw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.159677 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "4dac73d5-a471-44a7-a91a-3422e09c7bb0" (UID: "4dac73d5-a471-44a7-a91a-3422e09c7bb0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.183962 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-inventory" (OuterVolumeSpecName: "inventory") pod "4dac73d5-a471-44a7-a91a-3422e09c7bb0" (UID: "4dac73d5-a471-44a7-a91a-3422e09c7bb0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.203262 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4dac73d5-a471-44a7-a91a-3422e09c7bb0" (UID: "4dac73d5-a471-44a7-a91a-3422e09c7bb0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.203408 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "4dac73d5-a471-44a7-a91a-3422e09c7bb0" (UID: "4dac73d5-a471-44a7-a91a-3422e09c7bb0"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.255020 4717 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.255050 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.255063 4717 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.255074 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4zw8\" (UniqueName: \"kubernetes.io/projected/4dac73d5-a471-44a7-a91a-3422e09c7bb0-kube-api-access-l4zw8\") on node \"crc\" DevicePath \"\"" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.255084 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dac73d5-a471-44a7-a91a-3422e09c7bb0-inventory\") on node \"crc\" DevicePath \"\"" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.641478 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84" event={"ID":"4dac73d5-a471-44a7-a91a-3422e09c7bb0","Type":"ContainerDied","Data":"f119328bcdbb2308d337912c5982e3db1bece291bc0cf54f89af0f9d518049d2"} Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.641825 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f119328bcdbb2308d337912c5982e3db1bece291bc0cf54f89af0f9d518049d2" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.641563 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xpj84" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.777281 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng"] Feb 21 22:22:49 crc kubenswrapper[4717]: E0221 22:22:49.777844 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a83cf4-e5b8-4226-879a-1363e6006e98" containerName="extract-utilities" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.777892 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a83cf4-e5b8-4226-879a-1363e6006e98" containerName="extract-utilities" Feb 21 22:22:49 crc kubenswrapper[4717]: E0221 22:22:49.777915 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a83cf4-e5b8-4226-879a-1363e6006e98" containerName="registry-server" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.777925 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a83cf4-e5b8-4226-879a-1363e6006e98" containerName="registry-server" Feb 21 22:22:49 crc kubenswrapper[4717]: E0221 22:22:49.777962 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dac73d5-a471-44a7-a91a-3422e09c7bb0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.777974 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4dac73d5-a471-44a7-a91a-3422e09c7bb0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 21 22:22:49 crc kubenswrapper[4717]: E0221 22:22:49.778005 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a83cf4-e5b8-4226-879a-1363e6006e98" containerName="extract-content" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.778016 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a83cf4-e5b8-4226-879a-1363e6006e98" containerName="extract-content" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.778292 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="48a83cf4-e5b8-4226-879a-1363e6006e98" containerName="registry-server" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.778339 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dac73d5-a471-44a7-a91a-3422e09c7bb0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.779267 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.792548 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.792826 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.793356 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.794103 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.794248 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.794389 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.794526 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.801640 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng"] Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.867666 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: 
I0221 22:22:49.867726 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.867849 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.867970 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.868011 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.868038 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.868071 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.868119 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.868151 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.868386 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9bm9\" (UniqueName: \"kubernetes.io/projected/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-kube-api-access-m9bm9\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.868506 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.969931 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9bm9\" (UniqueName: \"kubernetes.io/projected/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-kube-api-access-m9bm9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.970084 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.970202 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: 
I0221 22:22:49.970247 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.970306 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.970403 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.970480 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.970539 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-combined-ca-bundle\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.970581 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.970670 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.970717 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.972786 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.976212 
4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.977112 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.977614 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.977745 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.977969 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: 
\"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.978429 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.979879 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.980171 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:49 crc kubenswrapper[4717]: I0221 22:22:49.985189 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:50 crc kubenswrapper[4717]: I0221 22:22:50.005655 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m9bm9\" (UniqueName: \"kubernetes.io/projected/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-kube-api-access-m9bm9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7l5ng\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:50 crc kubenswrapper[4717]: I0221 22:22:50.106116 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:22:50 crc kubenswrapper[4717]: I0221 22:22:50.654538 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zknb9" podUID="caf52af6-209b-42df-bb36-dcc6c3f94a63" containerName="registry-server" containerID="cri-o://161791f11f8205641e34caad4bfaea0f67b96339a063e306d9823f2e86919bcb" gracePeriod=2 Feb 21 22:22:50 crc kubenswrapper[4717]: I0221 22:22:50.698646 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng"] Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.677410 4717 generic.go:334] "Generic (PLEG): container finished" podID="caf52af6-209b-42df-bb36-dcc6c3f94a63" containerID="161791f11f8205641e34caad4bfaea0f67b96339a063e306d9823f2e86919bcb" exitCode=0 Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.677487 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zknb9" event={"ID":"caf52af6-209b-42df-bb36-dcc6c3f94a63","Type":"ContainerDied","Data":"161791f11f8205641e34caad4bfaea0f67b96339a063e306d9823f2e86919bcb"} Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.678091 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zknb9" event={"ID":"caf52af6-209b-42df-bb36-dcc6c3f94a63","Type":"ContainerDied","Data":"a47bc0024eea37704da0e77391d74494defd86cfee38994d7bcfe78426e268a9"} Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 
22:22:51.678137 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a47bc0024eea37704da0e77391d74494defd86cfee38994d7bcfe78426e268a9" Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.682754 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" event={"ID":"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c","Type":"ContainerStarted","Data":"1f4b7c01ce6e277a4f2c79c13a0743cc80c852078cf7001aef439cd25ef0e91f"} Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.682781 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" event={"ID":"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c","Type":"ContainerStarted","Data":"01eb37659c2aa252422fe3bd6e58068083af986794fe427214229fd1a2bc28c2"} Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.712834 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zknb9" Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.719409 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" podStartSLOduration=2.293288529 podStartE2EDuration="2.719391811s" podCreationTimestamp="2026-02-21 22:22:49 +0000 UTC" firstStartedPulling="2026-02-21 22:22:50.707719009 +0000 UTC m=+2185.489252641" lastFinishedPulling="2026-02-21 22:22:51.133822271 +0000 UTC m=+2185.915355923" observedRunningTime="2026-02-21 22:22:51.706548164 +0000 UTC m=+2186.488081806" watchObservedRunningTime="2026-02-21 22:22:51.719391811 +0000 UTC m=+2186.500925443" Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.811361 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caf52af6-209b-42df-bb36-dcc6c3f94a63-catalog-content\") pod \"caf52af6-209b-42df-bb36-dcc6c3f94a63\" (UID: 
\"caf52af6-209b-42df-bb36-dcc6c3f94a63\") " Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.811457 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fjz6\" (UniqueName: \"kubernetes.io/projected/caf52af6-209b-42df-bb36-dcc6c3f94a63-kube-api-access-5fjz6\") pod \"caf52af6-209b-42df-bb36-dcc6c3f94a63\" (UID: \"caf52af6-209b-42df-bb36-dcc6c3f94a63\") " Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.811564 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caf52af6-209b-42df-bb36-dcc6c3f94a63-utilities\") pod \"caf52af6-209b-42df-bb36-dcc6c3f94a63\" (UID: \"caf52af6-209b-42df-bb36-dcc6c3f94a63\") " Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.812527 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caf52af6-209b-42df-bb36-dcc6c3f94a63-utilities" (OuterVolumeSpecName: "utilities") pod "caf52af6-209b-42df-bb36-dcc6c3f94a63" (UID: "caf52af6-209b-42df-bb36-dcc6c3f94a63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.817281 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caf52af6-209b-42df-bb36-dcc6c3f94a63-kube-api-access-5fjz6" (OuterVolumeSpecName: "kube-api-access-5fjz6") pod "caf52af6-209b-42df-bb36-dcc6c3f94a63" (UID: "caf52af6-209b-42df-bb36-dcc6c3f94a63"). InnerVolumeSpecName "kube-api-access-5fjz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.862774 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caf52af6-209b-42df-bb36-dcc6c3f94a63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "caf52af6-209b-42df-bb36-dcc6c3f94a63" (UID: "caf52af6-209b-42df-bb36-dcc6c3f94a63"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.913151 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fjz6\" (UniqueName: \"kubernetes.io/projected/caf52af6-209b-42df-bb36-dcc6c3f94a63-kube-api-access-5fjz6\") on node \"crc\" DevicePath \"\"" Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.913181 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caf52af6-209b-42df-bb36-dcc6c3f94a63-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.913190 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caf52af6-209b-42df-bb36-dcc6c3f94a63-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:22:51 crc kubenswrapper[4717]: I0221 22:22:51.976661 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:22:51 crc kubenswrapper[4717]: E0221 22:22:51.977557 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:22:52 crc kubenswrapper[4717]: I0221 22:22:52.702027 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zknb9" Feb 21 22:22:52 crc kubenswrapper[4717]: I0221 22:22:52.731892 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zknb9"] Feb 21 22:22:52 crc kubenswrapper[4717]: I0221 22:22:52.754349 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zknb9"] Feb 21 22:22:54 crc kubenswrapper[4717]: I0221 22:22:54.000514 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caf52af6-209b-42df-bb36-dcc6c3f94a63" path="/var/lib/kubelet/pods/caf52af6-209b-42df-bb36-dcc6c3f94a63/volumes" Feb 21 22:23:06 crc kubenswrapper[4717]: I0221 22:23:06.976236 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:23:06 crc kubenswrapper[4717]: E0221 22:23:06.977192 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:23:17 crc kubenswrapper[4717]: I0221 22:23:17.977481 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:23:17 crc kubenswrapper[4717]: E0221 22:23:17.978663 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" 
podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.608514 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s8xdd"] Feb 21 22:23:24 crc kubenswrapper[4717]: E0221 22:23:24.644194 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caf52af6-209b-42df-bb36-dcc6c3f94a63" containerName="registry-server" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.644426 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="caf52af6-209b-42df-bb36-dcc6c3f94a63" containerName="registry-server" Feb 21 22:23:24 crc kubenswrapper[4717]: E0221 22:23:24.644884 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caf52af6-209b-42df-bb36-dcc6c3f94a63" containerName="extract-utilities" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.645262 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="caf52af6-209b-42df-bb36-dcc6c3f94a63" containerName="extract-utilities" Feb 21 22:23:24 crc kubenswrapper[4717]: E0221 22:23:24.645361 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caf52af6-209b-42df-bb36-dcc6c3f94a63" containerName="extract-content" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.645369 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="caf52af6-209b-42df-bb36-dcc6c3f94a63" containerName="extract-content" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.648899 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="caf52af6-209b-42df-bb36-dcc6c3f94a63" containerName="registry-server" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.662943 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8xdd"] Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.663846 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.808830 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-utilities\") pod \"redhat-operators-s8xdd\" (UID: \"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2\") " pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.808960 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-catalog-content\") pod \"redhat-operators-s8xdd\" (UID: \"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2\") " pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.809033 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfpdx\" (UniqueName: \"kubernetes.io/projected/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-kube-api-access-bfpdx\") pod \"redhat-operators-s8xdd\" (UID: \"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2\") " pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.910627 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-utilities\") pod \"redhat-operators-s8xdd\" (UID: \"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2\") " pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.911067 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-catalog-content\") pod \"redhat-operators-s8xdd\" (UID: 
\"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2\") " pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.911223 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfpdx\" (UniqueName: \"kubernetes.io/projected/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-kube-api-access-bfpdx\") pod \"redhat-operators-s8xdd\" (UID: \"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2\") " pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.911279 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-utilities\") pod \"redhat-operators-s8xdd\" (UID: \"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2\") " pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.911595 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-catalog-content\") pod \"redhat-operators-s8xdd\" (UID: \"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2\") " pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.935434 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfpdx\" (UniqueName: \"kubernetes.io/projected/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-kube-api-access-bfpdx\") pod \"redhat-operators-s8xdd\" (UID: \"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2\") " pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:24 crc kubenswrapper[4717]: I0221 22:23:24.991700 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:25 crc kubenswrapper[4717]: I0221 22:23:25.470142 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8xdd"] Feb 21 22:23:26 crc kubenswrapper[4717]: I0221 22:23:26.115474 4717 generic.go:334] "Generic (PLEG): container finished" podID="50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" containerID="370a9e9e0b45a9ff44a7aaae1a2dd73001c36639a68f5e43629a408b0515c96a" exitCode=0 Feb 21 22:23:26 crc kubenswrapper[4717]: I0221 22:23:26.115577 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8xdd" event={"ID":"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2","Type":"ContainerDied","Data":"370a9e9e0b45a9ff44a7aaae1a2dd73001c36639a68f5e43629a408b0515c96a"} Feb 21 22:23:26 crc kubenswrapper[4717]: I0221 22:23:26.115747 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8xdd" event={"ID":"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2","Type":"ContainerStarted","Data":"49f8f0f2e4517d72885a7bb507d8a06c68f2db6a51c708db8e553122ef4164ad"} Feb 21 22:23:26 crc kubenswrapper[4717]: I0221 22:23:26.117111 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 21 22:23:27 crc kubenswrapper[4717]: I0221 22:23:27.000927 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l9zw8"] Feb 21 22:23:27 crc kubenswrapper[4717]: I0221 22:23:27.003629 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:27 crc kubenswrapper[4717]: I0221 22:23:27.011594 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l9zw8"] Feb 21 22:23:27 crc kubenswrapper[4717]: I0221 22:23:27.053233 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/795b212f-ba82-4863-932e-10b9de78ac4e-catalog-content\") pod \"community-operators-l9zw8\" (UID: \"795b212f-ba82-4863-932e-10b9de78ac4e\") " pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:27 crc kubenswrapper[4717]: I0221 22:23:27.053448 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/795b212f-ba82-4863-932e-10b9de78ac4e-utilities\") pod \"community-operators-l9zw8\" (UID: \"795b212f-ba82-4863-932e-10b9de78ac4e\") " pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:27 crc kubenswrapper[4717]: I0221 22:23:27.053554 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv5kq\" (UniqueName: \"kubernetes.io/projected/795b212f-ba82-4863-932e-10b9de78ac4e-kube-api-access-rv5kq\") pod \"community-operators-l9zw8\" (UID: \"795b212f-ba82-4863-932e-10b9de78ac4e\") " pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:27 crc kubenswrapper[4717]: I0221 22:23:27.125737 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8xdd" event={"ID":"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2","Type":"ContainerStarted","Data":"fbdaa7abb6c21e0293d3a3626fa2e4610be5cdb195da1ef22b81bceb6beb1116"} Feb 21 22:23:27 crc kubenswrapper[4717]: I0221 22:23:27.155879 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/795b212f-ba82-4863-932e-10b9de78ac4e-catalog-content\") pod \"community-operators-l9zw8\" (UID: \"795b212f-ba82-4863-932e-10b9de78ac4e\") " pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:27 crc kubenswrapper[4717]: I0221 22:23:27.155980 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/795b212f-ba82-4863-932e-10b9de78ac4e-utilities\") pod \"community-operators-l9zw8\" (UID: \"795b212f-ba82-4863-932e-10b9de78ac4e\") " pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:27 crc kubenswrapper[4717]: I0221 22:23:27.156013 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv5kq\" (UniqueName: \"kubernetes.io/projected/795b212f-ba82-4863-932e-10b9de78ac4e-kube-api-access-rv5kq\") pod \"community-operators-l9zw8\" (UID: \"795b212f-ba82-4863-932e-10b9de78ac4e\") " pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:27 crc kubenswrapper[4717]: I0221 22:23:27.156442 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/795b212f-ba82-4863-932e-10b9de78ac4e-catalog-content\") pod \"community-operators-l9zw8\" (UID: \"795b212f-ba82-4863-932e-10b9de78ac4e\") " pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:27 crc kubenswrapper[4717]: I0221 22:23:27.156619 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/795b212f-ba82-4863-932e-10b9de78ac4e-utilities\") pod \"community-operators-l9zw8\" (UID: \"795b212f-ba82-4863-932e-10b9de78ac4e\") " pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:27 crc kubenswrapper[4717]: I0221 22:23:27.183495 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv5kq\" (UniqueName: 
\"kubernetes.io/projected/795b212f-ba82-4863-932e-10b9de78ac4e-kube-api-access-rv5kq\") pod \"community-operators-l9zw8\" (UID: \"795b212f-ba82-4863-932e-10b9de78ac4e\") " pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:27 crc kubenswrapper[4717]: I0221 22:23:27.343543 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:27 crc kubenswrapper[4717]: I0221 22:23:27.900193 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l9zw8"] Feb 21 22:23:27 crc kubenswrapper[4717]: W0221 22:23:27.905164 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod795b212f_ba82_4863_932e_10b9de78ac4e.slice/crio-c95553e650db78edec1ada08c5d7f5222e5baab342a583906685ff31237dcd17 WatchSource:0}: Error finding container c95553e650db78edec1ada08c5d7f5222e5baab342a583906685ff31237dcd17: Status 404 returned error can't find the container with id c95553e650db78edec1ada08c5d7f5222e5baab342a583906685ff31237dcd17 Feb 21 22:23:28 crc kubenswrapper[4717]: I0221 22:23:28.139526 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9zw8" event={"ID":"795b212f-ba82-4863-932e-10b9de78ac4e","Type":"ContainerStarted","Data":"c95553e650db78edec1ada08c5d7f5222e5baab342a583906685ff31237dcd17"} Feb 21 22:23:29 crc kubenswrapper[4717]: I0221 22:23:29.149778 4717 generic.go:334] "Generic (PLEG): container finished" podID="50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" containerID="fbdaa7abb6c21e0293d3a3626fa2e4610be5cdb195da1ef22b81bceb6beb1116" exitCode=0 Feb 21 22:23:29 crc kubenswrapper[4717]: I0221 22:23:29.149895 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8xdd" 
event={"ID":"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2","Type":"ContainerDied","Data":"fbdaa7abb6c21e0293d3a3626fa2e4610be5cdb195da1ef22b81bceb6beb1116"} Feb 21 22:23:29 crc kubenswrapper[4717]: I0221 22:23:29.153239 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9zw8" event={"ID":"795b212f-ba82-4863-932e-10b9de78ac4e","Type":"ContainerStarted","Data":"0354b7e26cfc2b736b130f48fd280dfce6d96adbdc0a01fe7f4a94826876b980"} Feb 21 22:23:30 crc kubenswrapper[4717]: I0221 22:23:30.167093 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8xdd" event={"ID":"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2","Type":"ContainerStarted","Data":"e815f6f7f5bb65646f364034d544b78f1d81636364dd766b21671600259e751e"} Feb 21 22:23:30 crc kubenswrapper[4717]: I0221 22:23:30.169636 4717 generic.go:334] "Generic (PLEG): container finished" podID="795b212f-ba82-4863-932e-10b9de78ac4e" containerID="0354b7e26cfc2b736b130f48fd280dfce6d96adbdc0a01fe7f4a94826876b980" exitCode=0 Feb 21 22:23:30 crc kubenswrapper[4717]: I0221 22:23:30.169712 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9zw8" event={"ID":"795b212f-ba82-4863-932e-10b9de78ac4e","Type":"ContainerDied","Data":"0354b7e26cfc2b736b130f48fd280dfce6d96adbdc0a01fe7f4a94826876b980"} Feb 21 22:23:30 crc kubenswrapper[4717]: I0221 22:23:30.193289 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s8xdd" podStartSLOduration=2.374187224 podStartE2EDuration="6.193271544s" podCreationTimestamp="2026-02-21 22:23:24 +0000 UTC" firstStartedPulling="2026-02-21 22:23:26.116880913 +0000 UTC m=+2220.898414535" lastFinishedPulling="2026-02-21 22:23:29.935965223 +0000 UTC m=+2224.717498855" observedRunningTime="2026-02-21 22:23:30.183891561 +0000 UTC m=+2224.965425193" watchObservedRunningTime="2026-02-21 22:23:30.193271544 +0000 UTC 
m=+2224.974805166" Feb 21 22:23:30 crc kubenswrapper[4717]: I0221 22:23:30.977443 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:23:30 crc kubenswrapper[4717]: E0221 22:23:30.977747 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:23:31 crc kubenswrapper[4717]: I0221 22:23:31.182586 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9zw8" event={"ID":"795b212f-ba82-4863-932e-10b9de78ac4e","Type":"ContainerStarted","Data":"da1d34382a544f639f2e8d3ed30b52e3090494e4b0222078039ab37bf244512a"} Feb 21 22:23:32 crc kubenswrapper[4717]: I0221 22:23:32.192874 4717 generic.go:334] "Generic (PLEG): container finished" podID="795b212f-ba82-4863-932e-10b9de78ac4e" containerID="da1d34382a544f639f2e8d3ed30b52e3090494e4b0222078039ab37bf244512a" exitCode=0 Feb 21 22:23:32 crc kubenswrapper[4717]: I0221 22:23:32.192949 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9zw8" event={"ID":"795b212f-ba82-4863-932e-10b9de78ac4e","Type":"ContainerDied","Data":"da1d34382a544f639f2e8d3ed30b52e3090494e4b0222078039ab37bf244512a"} Feb 21 22:23:33 crc kubenswrapper[4717]: I0221 22:23:33.205512 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9zw8" event={"ID":"795b212f-ba82-4863-932e-10b9de78ac4e","Type":"ContainerStarted","Data":"6e566d4f927bfee3d38c226aec84ae328c476976fa8ebf450365ca1d9cb8a47e"} Feb 21 22:23:34 crc kubenswrapper[4717]: I0221 22:23:34.992291 4717 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:34 crc kubenswrapper[4717]: I0221 22:23:34.993122 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:36 crc kubenswrapper[4717]: I0221 22:23:36.043034 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s8xdd" podUID="50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" containerName="registry-server" probeResult="failure" output=< Feb 21 22:23:36 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Feb 21 22:23:36 crc kubenswrapper[4717]: > Feb 21 22:23:37 crc kubenswrapper[4717]: I0221 22:23:37.345105 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:37 crc kubenswrapper[4717]: I0221 22:23:37.345181 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:37 crc kubenswrapper[4717]: I0221 22:23:37.400837 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:37 crc kubenswrapper[4717]: I0221 22:23:37.433233 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l9zw8" podStartSLOduration=9.039777717 podStartE2EDuration="11.433207524s" podCreationTimestamp="2026-02-21 22:23:26 +0000 UTC" firstStartedPulling="2026-02-21 22:23:30.17088423 +0000 UTC m=+2224.952417852" lastFinishedPulling="2026-02-21 22:23:32.564313997 +0000 UTC m=+2227.345847659" observedRunningTime="2026-02-21 22:23:33.237487689 +0000 UTC m=+2228.019021311" watchObservedRunningTime="2026-02-21 22:23:37.433207524 +0000 UTC m=+2232.214741156" Feb 21 22:23:38 crc kubenswrapper[4717]: I0221 22:23:38.326111 4717 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:38 crc kubenswrapper[4717]: I0221 22:23:38.388627 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l9zw8"] Feb 21 22:23:40 crc kubenswrapper[4717]: I0221 22:23:40.288604 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l9zw8" podUID="795b212f-ba82-4863-932e-10b9de78ac4e" containerName="registry-server" containerID="cri-o://6e566d4f927bfee3d38c226aec84ae328c476976fa8ebf450365ca1d9cb8a47e" gracePeriod=2 Feb 21 22:23:40 crc kubenswrapper[4717]: I0221 22:23:40.821336 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:40 crc kubenswrapper[4717]: I0221 22:23:40.883222 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv5kq\" (UniqueName: \"kubernetes.io/projected/795b212f-ba82-4863-932e-10b9de78ac4e-kube-api-access-rv5kq\") pod \"795b212f-ba82-4863-932e-10b9de78ac4e\" (UID: \"795b212f-ba82-4863-932e-10b9de78ac4e\") " Feb 21 22:23:40 crc kubenswrapper[4717]: I0221 22:23:40.883431 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/795b212f-ba82-4863-932e-10b9de78ac4e-catalog-content\") pod \"795b212f-ba82-4863-932e-10b9de78ac4e\" (UID: \"795b212f-ba82-4863-932e-10b9de78ac4e\") " Feb 21 22:23:40 crc kubenswrapper[4717]: I0221 22:23:40.883475 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/795b212f-ba82-4863-932e-10b9de78ac4e-utilities\") pod \"795b212f-ba82-4863-932e-10b9de78ac4e\" (UID: \"795b212f-ba82-4863-932e-10b9de78ac4e\") " Feb 21 22:23:40 crc kubenswrapper[4717]: I0221 22:23:40.884442 
4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/795b212f-ba82-4863-932e-10b9de78ac4e-utilities" (OuterVolumeSpecName: "utilities") pod "795b212f-ba82-4863-932e-10b9de78ac4e" (UID: "795b212f-ba82-4863-932e-10b9de78ac4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:23:40 crc kubenswrapper[4717]: I0221 22:23:40.884838 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/795b212f-ba82-4863-932e-10b9de78ac4e-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:23:40 crc kubenswrapper[4717]: I0221 22:23:40.893947 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/795b212f-ba82-4863-932e-10b9de78ac4e-kube-api-access-rv5kq" (OuterVolumeSpecName: "kube-api-access-rv5kq") pod "795b212f-ba82-4863-932e-10b9de78ac4e" (UID: "795b212f-ba82-4863-932e-10b9de78ac4e"). InnerVolumeSpecName "kube-api-access-rv5kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:23:40 crc kubenswrapper[4717]: I0221 22:23:40.949384 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/795b212f-ba82-4863-932e-10b9de78ac4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "795b212f-ba82-4863-932e-10b9de78ac4e" (UID: "795b212f-ba82-4863-932e-10b9de78ac4e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:23:40 crc kubenswrapper[4717]: I0221 22:23:40.988670 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/795b212f-ba82-4863-932e-10b9de78ac4e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:23:40 crc kubenswrapper[4717]: I0221 22:23:40.988723 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv5kq\" (UniqueName: \"kubernetes.io/projected/795b212f-ba82-4863-932e-10b9de78ac4e-kube-api-access-rv5kq\") on node \"crc\" DevicePath \"\"" Feb 21 22:23:41 crc kubenswrapper[4717]: I0221 22:23:41.300579 4717 generic.go:334] "Generic (PLEG): container finished" podID="795b212f-ba82-4863-932e-10b9de78ac4e" containerID="6e566d4f927bfee3d38c226aec84ae328c476976fa8ebf450365ca1d9cb8a47e" exitCode=0 Feb 21 22:23:41 crc kubenswrapper[4717]: I0221 22:23:41.300647 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9zw8" event={"ID":"795b212f-ba82-4863-932e-10b9de78ac4e","Type":"ContainerDied","Data":"6e566d4f927bfee3d38c226aec84ae328c476976fa8ebf450365ca1d9cb8a47e"} Feb 21 22:23:41 crc kubenswrapper[4717]: I0221 22:23:41.300727 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9zw8" event={"ID":"795b212f-ba82-4863-932e-10b9de78ac4e","Type":"ContainerDied","Data":"c95553e650db78edec1ada08c5d7f5222e5baab342a583906685ff31237dcd17"} Feb 21 22:23:41 crc kubenswrapper[4717]: I0221 22:23:41.300658 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l9zw8" Feb 21 22:23:41 crc kubenswrapper[4717]: I0221 22:23:41.300753 4717 scope.go:117] "RemoveContainer" containerID="6e566d4f927bfee3d38c226aec84ae328c476976fa8ebf450365ca1d9cb8a47e" Feb 21 22:23:41 crc kubenswrapper[4717]: I0221 22:23:41.322629 4717 scope.go:117] "RemoveContainer" containerID="da1d34382a544f639f2e8d3ed30b52e3090494e4b0222078039ab37bf244512a" Feb 21 22:23:41 crc kubenswrapper[4717]: I0221 22:23:41.353647 4717 scope.go:117] "RemoveContainer" containerID="0354b7e26cfc2b736b130f48fd280dfce6d96adbdc0a01fe7f4a94826876b980" Feb 21 22:23:41 crc kubenswrapper[4717]: I0221 22:23:41.370576 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l9zw8"] Feb 21 22:23:41 crc kubenswrapper[4717]: I0221 22:23:41.388917 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l9zw8"] Feb 21 22:23:41 crc kubenswrapper[4717]: I0221 22:23:41.424342 4717 scope.go:117] "RemoveContainer" containerID="6e566d4f927bfee3d38c226aec84ae328c476976fa8ebf450365ca1d9cb8a47e" Feb 21 22:23:41 crc kubenswrapper[4717]: E0221 22:23:41.425310 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e566d4f927bfee3d38c226aec84ae328c476976fa8ebf450365ca1d9cb8a47e\": container with ID starting with 6e566d4f927bfee3d38c226aec84ae328c476976fa8ebf450365ca1d9cb8a47e not found: ID does not exist" containerID="6e566d4f927bfee3d38c226aec84ae328c476976fa8ebf450365ca1d9cb8a47e" Feb 21 22:23:41 crc kubenswrapper[4717]: I0221 22:23:41.425369 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e566d4f927bfee3d38c226aec84ae328c476976fa8ebf450365ca1d9cb8a47e"} err="failed to get container status \"6e566d4f927bfee3d38c226aec84ae328c476976fa8ebf450365ca1d9cb8a47e\": rpc error: code = NotFound desc = could not find 
container \"6e566d4f927bfee3d38c226aec84ae328c476976fa8ebf450365ca1d9cb8a47e\": container with ID starting with 6e566d4f927bfee3d38c226aec84ae328c476976fa8ebf450365ca1d9cb8a47e not found: ID does not exist" Feb 21 22:23:41 crc kubenswrapper[4717]: I0221 22:23:41.425400 4717 scope.go:117] "RemoveContainer" containerID="da1d34382a544f639f2e8d3ed30b52e3090494e4b0222078039ab37bf244512a" Feb 21 22:23:41 crc kubenswrapper[4717]: E0221 22:23:41.425798 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da1d34382a544f639f2e8d3ed30b52e3090494e4b0222078039ab37bf244512a\": container with ID starting with da1d34382a544f639f2e8d3ed30b52e3090494e4b0222078039ab37bf244512a not found: ID does not exist" containerID="da1d34382a544f639f2e8d3ed30b52e3090494e4b0222078039ab37bf244512a" Feb 21 22:23:41 crc kubenswrapper[4717]: I0221 22:23:41.425855 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1d34382a544f639f2e8d3ed30b52e3090494e4b0222078039ab37bf244512a"} err="failed to get container status \"da1d34382a544f639f2e8d3ed30b52e3090494e4b0222078039ab37bf244512a\": rpc error: code = NotFound desc = could not find container \"da1d34382a544f639f2e8d3ed30b52e3090494e4b0222078039ab37bf244512a\": container with ID starting with da1d34382a544f639f2e8d3ed30b52e3090494e4b0222078039ab37bf244512a not found: ID does not exist" Feb 21 22:23:41 crc kubenswrapper[4717]: I0221 22:23:41.425916 4717 scope.go:117] "RemoveContainer" containerID="0354b7e26cfc2b736b130f48fd280dfce6d96adbdc0a01fe7f4a94826876b980" Feb 21 22:23:41 crc kubenswrapper[4717]: E0221 22:23:41.426248 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0354b7e26cfc2b736b130f48fd280dfce6d96adbdc0a01fe7f4a94826876b980\": container with ID starting with 0354b7e26cfc2b736b130f48fd280dfce6d96adbdc0a01fe7f4a94826876b980 not found: ID does 
not exist" containerID="0354b7e26cfc2b736b130f48fd280dfce6d96adbdc0a01fe7f4a94826876b980" Feb 21 22:23:41 crc kubenswrapper[4717]: I0221 22:23:41.426296 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0354b7e26cfc2b736b130f48fd280dfce6d96adbdc0a01fe7f4a94826876b980"} err="failed to get container status \"0354b7e26cfc2b736b130f48fd280dfce6d96adbdc0a01fe7f4a94826876b980\": rpc error: code = NotFound desc = could not find container \"0354b7e26cfc2b736b130f48fd280dfce6d96adbdc0a01fe7f4a94826876b980\": container with ID starting with 0354b7e26cfc2b736b130f48fd280dfce6d96adbdc0a01fe7f4a94826876b980 not found: ID does not exist" Feb 21 22:23:42 crc kubenswrapper[4717]: I0221 22:23:42.012504 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="795b212f-ba82-4863-932e-10b9de78ac4e" path="/var/lib/kubelet/pods/795b212f-ba82-4863-932e-10b9de78ac4e/volumes" Feb 21 22:23:42 crc kubenswrapper[4717]: I0221 22:23:42.977988 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:23:42 crc kubenswrapper[4717]: E0221 22:23:42.978397 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:23:45 crc kubenswrapper[4717]: I0221 22:23:45.066144 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:45 crc kubenswrapper[4717]: I0221 22:23:45.149576 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:45 
crc kubenswrapper[4717]: I0221 22:23:45.321401 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s8xdd"] Feb 21 22:23:46 crc kubenswrapper[4717]: I0221 22:23:46.394903 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s8xdd" podUID="50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" containerName="registry-server" containerID="cri-o://e815f6f7f5bb65646f364034d544b78f1d81636364dd766b21671600259e751e" gracePeriod=2 Feb 21 22:23:46 crc kubenswrapper[4717]: I0221 22:23:46.880955 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.021457 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-catalog-content\") pod \"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2\" (UID: \"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2\") " Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.021592 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfpdx\" (UniqueName: \"kubernetes.io/projected/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-kube-api-access-bfpdx\") pod \"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2\" (UID: \"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2\") " Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.021624 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-utilities\") pod \"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2\" (UID: \"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2\") " Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.023197 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-utilities" 
(OuterVolumeSpecName: "utilities") pod "50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" (UID: "50ab6f80-275f-4d97-b3aa-f9f9932bd3f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.027374 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-kube-api-access-bfpdx" (OuterVolumeSpecName: "kube-api-access-bfpdx") pod "50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" (UID: "50ab6f80-275f-4d97-b3aa-f9f9932bd3f2"). InnerVolumeSpecName "kube-api-access-bfpdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.124119 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfpdx\" (UniqueName: \"kubernetes.io/projected/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-kube-api-access-bfpdx\") on node \"crc\" DevicePath \"\"" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.124159 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.152460 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" (UID: "50ab6f80-275f-4d97-b3aa-f9f9932bd3f2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.226212 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.408157 4717 generic.go:334] "Generic (PLEG): container finished" podID="50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" containerID="e815f6f7f5bb65646f364034d544b78f1d81636364dd766b21671600259e751e" exitCode=0 Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.408193 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8xdd" event={"ID":"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2","Type":"ContainerDied","Data":"e815f6f7f5bb65646f364034d544b78f1d81636364dd766b21671600259e751e"} Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.408239 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8xdd" event={"ID":"50ab6f80-275f-4d97-b3aa-f9f9932bd3f2","Type":"ContainerDied","Data":"49f8f0f2e4517d72885a7bb507d8a06c68f2db6a51c708db8e553122ef4164ad"} Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.408261 4717 scope.go:117] "RemoveContainer" containerID="e815f6f7f5bb65646f364034d544b78f1d81636364dd766b21671600259e751e" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.408280 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s8xdd" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.437091 4717 scope.go:117] "RemoveContainer" containerID="fbdaa7abb6c21e0293d3a3626fa2e4610be5cdb195da1ef22b81bceb6beb1116" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.477519 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s8xdd"] Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.487166 4717 scope.go:117] "RemoveContainer" containerID="370a9e9e0b45a9ff44a7aaae1a2dd73001c36639a68f5e43629a408b0515c96a" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.490467 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s8xdd"] Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.528843 4717 scope.go:117] "RemoveContainer" containerID="e815f6f7f5bb65646f364034d544b78f1d81636364dd766b21671600259e751e" Feb 21 22:23:47 crc kubenswrapper[4717]: E0221 22:23:47.529483 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e815f6f7f5bb65646f364034d544b78f1d81636364dd766b21671600259e751e\": container with ID starting with e815f6f7f5bb65646f364034d544b78f1d81636364dd766b21671600259e751e not found: ID does not exist" containerID="e815f6f7f5bb65646f364034d544b78f1d81636364dd766b21671600259e751e" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.529556 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e815f6f7f5bb65646f364034d544b78f1d81636364dd766b21671600259e751e"} err="failed to get container status \"e815f6f7f5bb65646f364034d544b78f1d81636364dd766b21671600259e751e\": rpc error: code = NotFound desc = could not find container \"e815f6f7f5bb65646f364034d544b78f1d81636364dd766b21671600259e751e\": container with ID starting with e815f6f7f5bb65646f364034d544b78f1d81636364dd766b21671600259e751e not found: ID does 
not exist" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.529604 4717 scope.go:117] "RemoveContainer" containerID="fbdaa7abb6c21e0293d3a3626fa2e4610be5cdb195da1ef22b81bceb6beb1116" Feb 21 22:23:47 crc kubenswrapper[4717]: E0221 22:23:47.530675 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbdaa7abb6c21e0293d3a3626fa2e4610be5cdb195da1ef22b81bceb6beb1116\": container with ID starting with fbdaa7abb6c21e0293d3a3626fa2e4610be5cdb195da1ef22b81bceb6beb1116 not found: ID does not exist" containerID="fbdaa7abb6c21e0293d3a3626fa2e4610be5cdb195da1ef22b81bceb6beb1116" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.531976 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbdaa7abb6c21e0293d3a3626fa2e4610be5cdb195da1ef22b81bceb6beb1116"} err="failed to get container status \"fbdaa7abb6c21e0293d3a3626fa2e4610be5cdb195da1ef22b81bceb6beb1116\": rpc error: code = NotFound desc = could not find container \"fbdaa7abb6c21e0293d3a3626fa2e4610be5cdb195da1ef22b81bceb6beb1116\": container with ID starting with fbdaa7abb6c21e0293d3a3626fa2e4610be5cdb195da1ef22b81bceb6beb1116 not found: ID does not exist" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.532023 4717 scope.go:117] "RemoveContainer" containerID="370a9e9e0b45a9ff44a7aaae1a2dd73001c36639a68f5e43629a408b0515c96a" Feb 21 22:23:47 crc kubenswrapper[4717]: E0221 22:23:47.532418 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"370a9e9e0b45a9ff44a7aaae1a2dd73001c36639a68f5e43629a408b0515c96a\": container with ID starting with 370a9e9e0b45a9ff44a7aaae1a2dd73001c36639a68f5e43629a408b0515c96a not found: ID does not exist" containerID="370a9e9e0b45a9ff44a7aaae1a2dd73001c36639a68f5e43629a408b0515c96a" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.532463 4717 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"370a9e9e0b45a9ff44a7aaae1a2dd73001c36639a68f5e43629a408b0515c96a"} err="failed to get container status \"370a9e9e0b45a9ff44a7aaae1a2dd73001c36639a68f5e43629a408b0515c96a\": rpc error: code = NotFound desc = could not find container \"370a9e9e0b45a9ff44a7aaae1a2dd73001c36639a68f5e43629a408b0515c96a\": container with ID starting with 370a9e9e0b45a9ff44a7aaae1a2dd73001c36639a68f5e43629a408b0515c96a not found: ID does not exist" Feb 21 22:23:47 crc kubenswrapper[4717]: I0221 22:23:47.994748 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" path="/var/lib/kubelet/pods/50ab6f80-275f-4d97-b3aa-f9f9932bd3f2/volumes" Feb 21 22:23:56 crc kubenswrapper[4717]: I0221 22:23:56.977236 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:23:56 crc kubenswrapper[4717]: E0221 22:23:56.979141 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:24:10 crc kubenswrapper[4717]: I0221 22:24:10.976592 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:24:10 crc kubenswrapper[4717]: E0221 22:24:10.977499 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:24:23 crc kubenswrapper[4717]: I0221 22:24:23.985420 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:24:23 crc kubenswrapper[4717]: E0221 22:24:23.987453 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:24:36 crc kubenswrapper[4717]: I0221 22:24:36.977375 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:24:36 crc kubenswrapper[4717]: E0221 22:24:36.978629 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:24:49 crc kubenswrapper[4717]: I0221 22:24:49.976433 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:24:49 crc kubenswrapper[4717]: E0221 22:24:49.977317 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:25:00 crc kubenswrapper[4717]: I0221 22:25:00.984573 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:25:00 crc kubenswrapper[4717]: E0221 22:25:00.986124 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:25:13 crc kubenswrapper[4717]: I0221 22:25:13.976585 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:25:13 crc kubenswrapper[4717]: E0221 22:25:13.977425 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:25:21 crc kubenswrapper[4717]: I0221 22:25:21.429706 4717 generic.go:334] "Generic (PLEG): container finished" podID="0fa98805-0ef5-463d-9ae3-1a66efcb9b0c" containerID="1f4b7c01ce6e277a4f2c79c13a0743cc80c852078cf7001aef439cd25ef0e91f" exitCode=0 Feb 21 22:25:21 crc kubenswrapper[4717]: I0221 22:25:21.429844 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" event={"ID":"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c","Type":"ContainerDied","Data":"1f4b7c01ce6e277a4f2c79c13a0743cc80c852078cf7001aef439cd25ef0e91f"} Feb 21 22:25:22 crc kubenswrapper[4717]: I0221 22:25:22.971459 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.057375 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-0\") pod \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.057463 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-migration-ssh-key-0\") pod \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.057532 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-1\") pod \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.057615 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-extra-config-0\") pod \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.057642 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-2\") pod \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.057661 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-combined-ca-bundle\") pod \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.057694 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-3\") pod \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.057835 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-inventory\") pod \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.057874 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-migration-ssh-key-1\") pod \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.057905 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-ssh-key-openstack-edpm-ipam\") pod \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.057924 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9bm9\" (UniqueName: \"kubernetes.io/projected/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-kube-api-access-m9bm9\") pod \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\" (UID: \"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c\") " Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.065087 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-kube-api-access-m9bm9" (OuterVolumeSpecName: "kube-api-access-m9bm9") pod "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c" (UID: "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c"). InnerVolumeSpecName "kube-api-access-m9bm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.069344 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c" (UID: "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.089192 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-inventory" (OuterVolumeSpecName: "inventory") pod "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c" (UID: "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.089721 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c" (UID: "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.090607 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c" (UID: "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.091380 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c" (UID: "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.098079 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c" (UID: "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.101504 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c" (UID: "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.112542 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c" (UID: "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.114259 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c" (UID: "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.115454 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c" (UID: "0fa98805-0ef5-463d-9ae3-1a66efcb9b0c"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.159788 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-inventory\") on node \"crc\" DevicePath \"\"" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.160158 4717 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.160241 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.160312 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9bm9\" (UniqueName: \"kubernetes.io/projected/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-kube-api-access-m9bm9\") on node \"crc\" DevicePath \"\"" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.160367 4717 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.160439 4717 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.160513 4717 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-1\") 
on node \"crc\" DevicePath \"\"" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.160587 4717 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.160693 4717 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.160793 4717 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.160905 4717 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0fa98805-0ef5-463d-9ae3-1a66efcb9b0c-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.448206 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" event={"ID":"0fa98805-0ef5-463d-9ae3-1a66efcb9b0c","Type":"ContainerDied","Data":"01eb37659c2aa252422fe3bd6e58068083af986794fe427214229fd1a2bc28c2"} Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.448252 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01eb37659c2aa252422fe3bd6e58068083af986794fe427214229fd1a2bc28c2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.448276 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7l5ng" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.637799 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2"] Feb 21 22:25:23 crc kubenswrapper[4717]: E0221 22:25:23.638288 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795b212f-ba82-4863-932e-10b9de78ac4e" containerName="extract-utilities" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.638312 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="795b212f-ba82-4863-932e-10b9de78ac4e" containerName="extract-utilities" Feb 21 22:25:23 crc kubenswrapper[4717]: E0221 22:25:23.638337 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" containerName="extract-content" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.638346 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" containerName="extract-content" Feb 21 22:25:23 crc kubenswrapper[4717]: E0221 22:25:23.638357 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795b212f-ba82-4863-932e-10b9de78ac4e" containerName="extract-content" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.638365 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="795b212f-ba82-4863-932e-10b9de78ac4e" containerName="extract-content" Feb 21 22:25:23 crc kubenswrapper[4717]: E0221 22:25:23.638379 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795b212f-ba82-4863-932e-10b9de78ac4e" containerName="registry-server" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.638386 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="795b212f-ba82-4863-932e-10b9de78ac4e" containerName="registry-server" Feb 21 22:25:23 crc kubenswrapper[4717]: E0221 22:25:23.638407 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" containerName="registry-server" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.638416 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" containerName="registry-server" Feb 21 22:25:23 crc kubenswrapper[4717]: E0221 22:25:23.638439 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" containerName="extract-utilities" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.638446 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" containerName="extract-utilities" Feb 21 22:25:23 crc kubenswrapper[4717]: E0221 22:25:23.638463 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa98805-0ef5-463d-9ae3-1a66efcb9b0c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.638469 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa98805-0ef5-463d-9ae3-1a66efcb9b0c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.638686 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ab6f80-275f-4d97-b3aa-f9f9932bd3f2" containerName="registry-server" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.638709 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fa98805-0ef5-463d-9ae3-1a66efcb9b0c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.638722 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="795b212f-ba82-4863-932e-10b9de78ac4e" containerName="registry-server" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.639476 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.642485 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.642657 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.642666 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4hd2s" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.642932 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.646575 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.664137 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2"] Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.670449 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.670541 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-telemetry-combined-ca-bundle\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.670607 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.670643 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc97m\" (UniqueName: \"kubernetes.io/projected/6890db4e-d63d-4b19-87aa-b5186b85ece1-kube-api-access-cc97m\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.670837 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.670992 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.671068 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.772414 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.772501 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.772615 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.772653 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.772698 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.772720 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc97m\" (UniqueName: \"kubernetes.io/projected/6890db4e-d63d-4b19-87aa-b5186b85ece1-kube-api-access-cc97m\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.772767 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.776786 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.777262 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.777553 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.777750 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.778287 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.778593 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.792995 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc97m\" (UniqueName: \"kubernetes.io/projected/6890db4e-d63d-4b19-87aa-b5186b85ece1-kube-api-access-cc97m\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:23 crc kubenswrapper[4717]: I0221 22:25:23.972286 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:25:24 crc kubenswrapper[4717]: I0221 22:25:24.396472 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2"] Feb 21 22:25:24 crc kubenswrapper[4717]: W0221 22:25:24.408114 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6890db4e_d63d_4b19_87aa_b5186b85ece1.slice/crio-c8b9a3ae8fd2675393e558f3933fa635d90ff7efececfc9ac97c8c48b8b1f5c2 WatchSource:0}: Error finding container c8b9a3ae8fd2675393e558f3933fa635d90ff7efececfc9ac97c8c48b8b1f5c2: Status 404 returned error can't find the container with id c8b9a3ae8fd2675393e558f3933fa635d90ff7efececfc9ac97c8c48b8b1f5c2 Feb 21 22:25:24 crc kubenswrapper[4717]: I0221 22:25:24.457528 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" event={"ID":"6890db4e-d63d-4b19-87aa-b5186b85ece1","Type":"ContainerStarted","Data":"c8b9a3ae8fd2675393e558f3933fa635d90ff7efececfc9ac97c8c48b8b1f5c2"} Feb 21 22:25:24 crc kubenswrapper[4717]: I0221 22:25:24.977511 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:25:24 crc kubenswrapper[4717]: E0221 22:25:24.978479 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:25:25 crc kubenswrapper[4717]: I0221 22:25:25.470902 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" event={"ID":"6890db4e-d63d-4b19-87aa-b5186b85ece1","Type":"ContainerStarted","Data":"17fe7d368ffc90b8badf611bd2f852e213bbe12460dfb056710a5f5ebcafe941"} Feb 21 22:25:25 crc kubenswrapper[4717]: I0221 22:25:25.501634 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" podStartSLOduration=2.04944781 podStartE2EDuration="2.501613304s" podCreationTimestamp="2026-02-21 22:25:23 +0000 UTC" firstStartedPulling="2026-02-21 22:25:24.409980021 +0000 UTC m=+2339.191513633" lastFinishedPulling="2026-02-21 22:25:24.862145465 +0000 UTC m=+2339.643679127" observedRunningTime="2026-02-21 22:25:25.494821242 +0000 UTC m=+2340.276354864" watchObservedRunningTime="2026-02-21 22:25:25.501613304 +0000 UTC m=+2340.283146926" Feb 21 22:25:39 crc kubenswrapper[4717]: I0221 22:25:39.977842 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:25:39 crc kubenswrapper[4717]: E0221 22:25:39.978695 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:25:50 crc kubenswrapper[4717]: I0221 22:25:50.978243 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:25:50 crc kubenswrapper[4717]: E0221 22:25:50.979796 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:26:02 crc kubenswrapper[4717]: I0221 22:26:02.977162 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:26:02 crc kubenswrapper[4717]: E0221 22:26:02.978376 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:26:13 crc kubenswrapper[4717]: I0221 22:26:13.976225 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:26:13 crc kubenswrapper[4717]: E0221 22:26:13.977119 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:26:26 crc kubenswrapper[4717]: I0221 22:26:26.977245 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:26:26 crc kubenswrapper[4717]: E0221 22:26:26.978450 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:26:39 crc kubenswrapper[4717]: I0221 22:26:39.977001 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:26:39 crc kubenswrapper[4717]: E0221 22:26:39.978132 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:26:54 crc kubenswrapper[4717]: I0221 22:26:54.976977 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:26:54 crc kubenswrapper[4717]: E0221 22:26:54.978120 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:27:05 crc kubenswrapper[4717]: I0221 22:27:05.990083 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:27:05 crc kubenswrapper[4717]: E0221 22:27:05.991249 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:27:16 crc kubenswrapper[4717]: I0221 22:27:16.976697 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:27:16 crc kubenswrapper[4717]: E0221 22:27:16.977515 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:27:28 crc kubenswrapper[4717]: I0221 22:27:28.976080 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:27:28 crc kubenswrapper[4717]: E0221 22:27:28.977016 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:27:39 crc kubenswrapper[4717]: I0221 22:27:39.977836 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:27:40 crc kubenswrapper[4717]: I0221 22:27:40.955337 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" 
event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"9d91e1e907ed42ee54b189c1b4dbcc4ded12cabfe2d35ffcc00d8405cf56fc50"} Feb 21 22:28:05 crc kubenswrapper[4717]: I0221 22:28:05.231658 4717 generic.go:334] "Generic (PLEG): container finished" podID="6890db4e-d63d-4b19-87aa-b5186b85ece1" containerID="17fe7d368ffc90b8badf611bd2f852e213bbe12460dfb056710a5f5ebcafe941" exitCode=0 Feb 21 22:28:05 crc kubenswrapper[4717]: I0221 22:28:05.231769 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" event={"ID":"6890db4e-d63d-4b19-87aa-b5186b85ece1","Type":"ContainerDied","Data":"17fe7d368ffc90b8badf611bd2f852e213bbe12460dfb056710a5f5ebcafe941"} Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.753025 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.814061 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ssh-key-openstack-edpm-ipam\") pod \"6890db4e-d63d-4b19-87aa-b5186b85ece1\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.814359 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-1\") pod \"6890db4e-d63d-4b19-87aa-b5186b85ece1\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.814390 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-0\") pod \"6890db4e-d63d-4b19-87aa-b5186b85ece1\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.814580 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-inventory\") pod \"6890db4e-d63d-4b19-87aa-b5186b85ece1\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.814622 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-2\") pod \"6890db4e-d63d-4b19-87aa-b5186b85ece1\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.814663 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc97m\" (UniqueName: \"kubernetes.io/projected/6890db4e-d63d-4b19-87aa-b5186b85ece1-kube-api-access-cc97m\") pod \"6890db4e-d63d-4b19-87aa-b5186b85ece1\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.814690 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-telemetry-combined-ca-bundle\") pod \"6890db4e-d63d-4b19-87aa-b5186b85ece1\" (UID: \"6890db4e-d63d-4b19-87aa-b5186b85ece1\") " Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.838935 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "6890db4e-d63d-4b19-87aa-b5186b85ece1" (UID: 
"6890db4e-d63d-4b19-87aa-b5186b85ece1"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.839484 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6890db4e-d63d-4b19-87aa-b5186b85ece1-kube-api-access-cc97m" (OuterVolumeSpecName: "kube-api-access-cc97m") pod "6890db4e-d63d-4b19-87aa-b5186b85ece1" (UID: "6890db4e-d63d-4b19-87aa-b5186b85ece1"). InnerVolumeSpecName "kube-api-access-cc97m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.844671 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-inventory" (OuterVolumeSpecName: "inventory") pod "6890db4e-d63d-4b19-87aa-b5186b85ece1" (UID: "6890db4e-d63d-4b19-87aa-b5186b85ece1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.856687 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "6890db4e-d63d-4b19-87aa-b5186b85ece1" (UID: "6890db4e-d63d-4b19-87aa-b5186b85ece1"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.871709 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "6890db4e-d63d-4b19-87aa-b5186b85ece1" (UID: "6890db4e-d63d-4b19-87aa-b5186b85ece1"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.875613 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6890db4e-d63d-4b19-87aa-b5186b85ece1" (UID: "6890db4e-d63d-4b19-87aa-b5186b85ece1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.882624 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "6890db4e-d63d-4b19-87aa-b5186b85ece1" (UID: "6890db4e-d63d-4b19-87aa-b5186b85ece1"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.918291 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.918367 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc97m\" (UniqueName: \"kubernetes.io/projected/6890db4e-d63d-4b19-87aa-b5186b85ece1-kube-api-access-cc97m\") on node \"crc\" DevicePath \"\"" Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.918392 4717 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.918422 4717 reconciler_common.go:293] "Volume 
detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.918448 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.918475 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 21 22:28:06 crc kubenswrapper[4717]: I0221 22:28:06.918495 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6890db4e-d63d-4b19-87aa-b5186b85ece1-inventory\") on node \"crc\" DevicePath \"\"" Feb 21 22:28:07 crc kubenswrapper[4717]: I0221 22:28:07.257054 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" event={"ID":"6890db4e-d63d-4b19-87aa-b5186b85ece1","Type":"ContainerDied","Data":"c8b9a3ae8fd2675393e558f3933fa635d90ff7efececfc9ac97c8c48b8b1f5c2"} Feb 21 22:28:07 crc kubenswrapper[4717]: I0221 22:28:07.257098 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8b9a3ae8fd2675393e558f3933fa635d90ff7efececfc9ac97c8c48b8b1f5c2" Feb 21 22:28:07 crc kubenswrapper[4717]: I0221 22:28:07.257149 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.271439 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 21 22:29:02 crc kubenswrapper[4717]: E0221 22:29:02.272444 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6890db4e-d63d-4b19-87aa-b5186b85ece1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.272465 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6890db4e-d63d-4b19-87aa-b5186b85ece1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.272735 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6890db4e-d63d-4b19-87aa-b5186b85ece1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.273466 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.277197 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.277731 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nxzb2" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.278000 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.278015 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.287812 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.371053 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.371118 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.371143 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.371168 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.371251 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.371272 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.371303 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ltm8\" (UniqueName: \"kubernetes.io/projected/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-kube-api-access-6ltm8\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.371354 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-config-data\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.371383 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.473724 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.473798 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.473854 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ltm8\" (UniqueName: \"kubernetes.io/projected/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-kube-api-access-6ltm8\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.473972 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-config-data\") pod \"tempest-tests-tempest\" 
(UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.474024 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.474064 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.474120 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.474164 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.474210 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " 
pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.474251 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.474722 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.475236 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.476489 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.478244 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-config-data\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc 
kubenswrapper[4717]: I0221 22:29:02.481535 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.482050 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.493721 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.501372 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ltm8\" (UniqueName: \"kubernetes.io/projected/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-kube-api-access-6ltm8\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.507180 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " pod="openstack/tempest-tests-tempest" Feb 21 22:29:02 crc kubenswrapper[4717]: I0221 22:29:02.621835 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 21 22:29:03 crc kubenswrapper[4717]: I0221 22:29:03.131508 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 21 22:29:03 crc kubenswrapper[4717]: I0221 22:29:03.137951 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 21 22:29:03 crc kubenswrapper[4717]: I0221 22:29:03.883261 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35","Type":"ContainerStarted","Data":"afd8bbf3f0d5e3b0a2a56a30fc133c9e2060a62380c7b8a8441b9792417cb82a"} Feb 21 22:29:30 crc kubenswrapper[4717]: I0221 22:29:30.598177 4717 scope.go:117] "RemoveContainer" containerID="b4c46acdfef739bef3c12893d005fb1227e9de1d5604b3665fca46fd59620e02" Feb 21 22:29:36 crc kubenswrapper[4717]: I0221 22:29:36.439988 4717 scope.go:117] "RemoveContainer" containerID="eb340ed774ffdad49991eab943abc288725cf920deec14a0b8ea2c25b4f0a658" Feb 21 22:29:36 crc kubenswrapper[4717]: I0221 22:29:36.496159 4717 scope.go:117] "RemoveContainer" containerID="161791f11f8205641e34caad4bfaea0f67b96339a063e306d9823f2e86919bcb" Feb 21 22:29:36 crc kubenswrapper[4717]: E0221 22:29:36.505319 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 21 22:29:36 crc kubenswrapper[4717]: E0221 22:29:36.505552 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ltm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 21 22:29:36 crc kubenswrapper[4717]: E0221 22:29:36.506742 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35" Feb 21 22:29:37 crc kubenswrapper[4717]: E0221 22:29:37.216154 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35" Feb 21 22:29:48 crc 
kubenswrapper[4717]: I0221 22:29:48.466188 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 21 22:29:50 crc kubenswrapper[4717]: I0221 22:29:50.364620 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35","Type":"ContainerStarted","Data":"60a6b28a7b2140fa174fbdae0254a5e430559ecadfe34bc38cee37fd5986dcdb"} Feb 21 22:29:50 crc kubenswrapper[4717]: I0221 22:29:50.391092 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.0652521010000005 podStartE2EDuration="49.391070843s" podCreationTimestamp="2026-02-21 22:29:01 +0000 UTC" firstStartedPulling="2026-02-21 22:29:03.137714341 +0000 UTC m=+2557.919247963" lastFinishedPulling="2026-02-21 22:29:48.463533073 +0000 UTC m=+2603.245066705" observedRunningTime="2026-02-21 22:29:50.386212128 +0000 UTC m=+2605.167745790" watchObservedRunningTime="2026-02-21 22:29:50.391070843 +0000 UTC m=+2605.172604465" Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.153400 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9"] Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.155712 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.158504 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.158720 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.165633 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9"] Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.330375 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62l9h\" (UniqueName: \"kubernetes.io/projected/24acc183-4af6-4b0c-adbb-47c2d867511a-kube-api-access-62l9h\") pod \"collect-profiles-29528550-n2pw9\" (UID: \"24acc183-4af6-4b0c-adbb-47c2d867511a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.330567 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24acc183-4af6-4b0c-adbb-47c2d867511a-secret-volume\") pod \"collect-profiles-29528550-n2pw9\" (UID: \"24acc183-4af6-4b0c-adbb-47c2d867511a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.330777 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24acc183-4af6-4b0c-adbb-47c2d867511a-config-volume\") pod \"collect-profiles-29528550-n2pw9\" (UID: \"24acc183-4af6-4b0c-adbb-47c2d867511a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.432551 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62l9h\" (UniqueName: \"kubernetes.io/projected/24acc183-4af6-4b0c-adbb-47c2d867511a-kube-api-access-62l9h\") pod \"collect-profiles-29528550-n2pw9\" (UID: \"24acc183-4af6-4b0c-adbb-47c2d867511a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.432635 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24acc183-4af6-4b0c-adbb-47c2d867511a-secret-volume\") pod \"collect-profiles-29528550-n2pw9\" (UID: \"24acc183-4af6-4b0c-adbb-47c2d867511a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.432694 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24acc183-4af6-4b0c-adbb-47c2d867511a-config-volume\") pod \"collect-profiles-29528550-n2pw9\" (UID: \"24acc183-4af6-4b0c-adbb-47c2d867511a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.433981 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24acc183-4af6-4b0c-adbb-47c2d867511a-config-volume\") pod \"collect-profiles-29528550-n2pw9\" (UID: \"24acc183-4af6-4b0c-adbb-47c2d867511a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.445899 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/24acc183-4af6-4b0c-adbb-47c2d867511a-secret-volume\") pod \"collect-profiles-29528550-n2pw9\" (UID: \"24acc183-4af6-4b0c-adbb-47c2d867511a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.457834 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62l9h\" (UniqueName: \"kubernetes.io/projected/24acc183-4af6-4b0c-adbb-47c2d867511a-kube-api-access-62l9h\") pod \"collect-profiles-29528550-n2pw9\" (UID: \"24acc183-4af6-4b0c-adbb-47c2d867511a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.482652 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" Feb 21 22:30:00 crc kubenswrapper[4717]: I0221 22:30:00.989014 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9"] Feb 21 22:30:01 crc kubenswrapper[4717]: I0221 22:30:01.469600 4717 generic.go:334] "Generic (PLEG): container finished" podID="24acc183-4af6-4b0c-adbb-47c2d867511a" containerID="0e222e40b34fdc4c582633bc69eefcdbc56eca5d137d6c2a148d07d93ad47a04" exitCode=0 Feb 21 22:30:01 crc kubenswrapper[4717]: I0221 22:30:01.469737 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" event={"ID":"24acc183-4af6-4b0c-adbb-47c2d867511a","Type":"ContainerDied","Data":"0e222e40b34fdc4c582633bc69eefcdbc56eca5d137d6c2a148d07d93ad47a04"} Feb 21 22:30:01 crc kubenswrapper[4717]: I0221 22:30:01.469958 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" 
event={"ID":"24acc183-4af6-4b0c-adbb-47c2d867511a","Type":"ContainerStarted","Data":"72005f9daea98d21353b50f8bdfbcb0703d71ef37d0ce5eaa1ca62f3020a6224"} Feb 21 22:30:02 crc kubenswrapper[4717]: I0221 22:30:02.853173 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" Feb 21 22:30:02 crc kubenswrapper[4717]: I0221 22:30:02.980642 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24acc183-4af6-4b0c-adbb-47c2d867511a-secret-volume\") pod \"24acc183-4af6-4b0c-adbb-47c2d867511a\" (UID: \"24acc183-4af6-4b0c-adbb-47c2d867511a\") " Feb 21 22:30:02 crc kubenswrapper[4717]: I0221 22:30:02.980765 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24acc183-4af6-4b0c-adbb-47c2d867511a-config-volume\") pod \"24acc183-4af6-4b0c-adbb-47c2d867511a\" (UID: \"24acc183-4af6-4b0c-adbb-47c2d867511a\") " Feb 21 22:30:02 crc kubenswrapper[4717]: I0221 22:30:02.980831 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62l9h\" (UniqueName: \"kubernetes.io/projected/24acc183-4af6-4b0c-adbb-47c2d867511a-kube-api-access-62l9h\") pod \"24acc183-4af6-4b0c-adbb-47c2d867511a\" (UID: \"24acc183-4af6-4b0c-adbb-47c2d867511a\") " Feb 21 22:30:02 crc kubenswrapper[4717]: I0221 22:30:02.981926 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24acc183-4af6-4b0c-adbb-47c2d867511a-config-volume" (OuterVolumeSpecName: "config-volume") pod "24acc183-4af6-4b0c-adbb-47c2d867511a" (UID: "24acc183-4af6-4b0c-adbb-47c2d867511a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:30:02 crc kubenswrapper[4717]: I0221 22:30:02.987924 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24acc183-4af6-4b0c-adbb-47c2d867511a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "24acc183-4af6-4b0c-adbb-47c2d867511a" (UID: "24acc183-4af6-4b0c-adbb-47c2d867511a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:30:02 crc kubenswrapper[4717]: I0221 22:30:02.988060 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24acc183-4af6-4b0c-adbb-47c2d867511a-kube-api-access-62l9h" (OuterVolumeSpecName: "kube-api-access-62l9h") pod "24acc183-4af6-4b0c-adbb-47c2d867511a" (UID: "24acc183-4af6-4b0c-adbb-47c2d867511a"). InnerVolumeSpecName "kube-api-access-62l9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:30:03 crc kubenswrapper[4717]: I0221 22:30:03.083788 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24acc183-4af6-4b0c-adbb-47c2d867511a-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 21 22:30:03 crc kubenswrapper[4717]: I0221 22:30:03.083833 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24acc183-4af6-4b0c-adbb-47c2d867511a-config-volume\") on node \"crc\" DevicePath \"\"" Feb 21 22:30:03 crc kubenswrapper[4717]: I0221 22:30:03.083847 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62l9h\" (UniqueName: \"kubernetes.io/projected/24acc183-4af6-4b0c-adbb-47c2d867511a-kube-api-access-62l9h\") on node \"crc\" DevicePath \"\"" Feb 21 22:30:03 crc kubenswrapper[4717]: I0221 22:30:03.494853 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" 
event={"ID":"24acc183-4af6-4b0c-adbb-47c2d867511a","Type":"ContainerDied","Data":"72005f9daea98d21353b50f8bdfbcb0703d71ef37d0ce5eaa1ca62f3020a6224"} Feb 21 22:30:03 crc kubenswrapper[4717]: I0221 22:30:03.494919 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72005f9daea98d21353b50f8bdfbcb0703d71ef37d0ce5eaa1ca62f3020a6224" Feb 21 22:30:03 crc kubenswrapper[4717]: I0221 22:30:03.494938 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528550-n2pw9" Feb 21 22:30:03 crc kubenswrapper[4717]: I0221 22:30:03.964815 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv"] Feb 21 22:30:03 crc kubenswrapper[4717]: I0221 22:30:03.974294 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528505-5wtqv"] Feb 21 22:30:04 crc kubenswrapper[4717]: I0221 22:30:04.025340 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="450eb855-2d6d-4503-9a7e-1980d8e97346" path="/var/lib/kubelet/pods/450eb855-2d6d-4503-9a7e-1980d8e97346/volumes" Feb 21 22:30:09 crc kubenswrapper[4717]: I0221 22:30:09.062402 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:30:09 crc kubenswrapper[4717]: I0221 22:30:09.064593 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:30:36 crc 
kubenswrapper[4717]: I0221 22:30:36.592719 4717 scope.go:117] "RemoveContainer" containerID="f168c1a45fac9a09bf48c9e2e1de86c5f6eaab1f1265501af4315869e6fcb1dc" Feb 21 22:30:39 crc kubenswrapper[4717]: I0221 22:30:39.062306 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:30:39 crc kubenswrapper[4717]: I0221 22:30:39.062986 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:31:09 crc kubenswrapper[4717]: I0221 22:31:09.062349 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:31:09 crc kubenswrapper[4717]: I0221 22:31:09.062972 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:31:09 crc kubenswrapper[4717]: I0221 22:31:09.063056 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 22:31:09 crc kubenswrapper[4717]: I0221 22:31:09.064107 4717 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d91e1e907ed42ee54b189c1b4dbcc4ded12cabfe2d35ffcc00d8405cf56fc50"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 22:31:09 crc kubenswrapper[4717]: I0221 22:31:09.064238 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://9d91e1e907ed42ee54b189c1b4dbcc4ded12cabfe2d35ffcc00d8405cf56fc50" gracePeriod=600 Feb 21 22:31:09 crc kubenswrapper[4717]: I0221 22:31:09.253681 4717 generic.go:334] "Generic (PLEG): container finished" podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerID="9d91e1e907ed42ee54b189c1b4dbcc4ded12cabfe2d35ffcc00d8405cf56fc50" exitCode=0 Feb 21 22:31:09 crc kubenswrapper[4717]: I0221 22:31:09.253911 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"9d91e1e907ed42ee54b189c1b4dbcc4ded12cabfe2d35ffcc00d8405cf56fc50"} Feb 21 22:31:09 crc kubenswrapper[4717]: I0221 22:31:09.253999 4717 scope.go:117] "RemoveContainer" containerID="9ae968d4b2262aa4078430e304c83212aeaa32ee1955f472931deaf1096c8642" Feb 21 22:31:10 crc kubenswrapper[4717]: I0221 22:31:10.268404 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674"} Feb 21 22:33:09 crc kubenswrapper[4717]: I0221 22:33:09.062309 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:33:09 crc kubenswrapper[4717]: I0221 22:33:09.063074 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.061307 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wqxnc"] Feb 21 22:33:28 crc kubenswrapper[4717]: E0221 22:33:28.062450 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24acc183-4af6-4b0c-adbb-47c2d867511a" containerName="collect-profiles" Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.062474 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="24acc183-4af6-4b0c-adbb-47c2d867511a" containerName="collect-profiles" Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.062921 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="24acc183-4af6-4b0c-adbb-47c2d867511a" containerName="collect-profiles" Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.065280 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.082255 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqxnc"] Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.174470 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkvsn\" (UniqueName: \"kubernetes.io/projected/4c040078-5a12-407b-9263-4fdc6967c379-kube-api-access-zkvsn\") pod \"redhat-marketplace-wqxnc\" (UID: \"4c040078-5a12-407b-9263-4fdc6967c379\") " pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.174554 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c040078-5a12-407b-9263-4fdc6967c379-utilities\") pod \"redhat-marketplace-wqxnc\" (UID: \"4c040078-5a12-407b-9263-4fdc6967c379\") " pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.174680 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c040078-5a12-407b-9263-4fdc6967c379-catalog-content\") pod \"redhat-marketplace-wqxnc\" (UID: \"4c040078-5a12-407b-9263-4fdc6967c379\") " pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.276358 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c040078-5a12-407b-9263-4fdc6967c379-catalog-content\") pod \"redhat-marketplace-wqxnc\" (UID: \"4c040078-5a12-407b-9263-4fdc6967c379\") " pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.276485 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zkvsn\" (UniqueName: \"kubernetes.io/projected/4c040078-5a12-407b-9263-4fdc6967c379-kube-api-access-zkvsn\") pod \"redhat-marketplace-wqxnc\" (UID: \"4c040078-5a12-407b-9263-4fdc6967c379\") " pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.276530 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c040078-5a12-407b-9263-4fdc6967c379-utilities\") pod \"redhat-marketplace-wqxnc\" (UID: \"4c040078-5a12-407b-9263-4fdc6967c379\") " pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.276982 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c040078-5a12-407b-9263-4fdc6967c379-catalog-content\") pod \"redhat-marketplace-wqxnc\" (UID: \"4c040078-5a12-407b-9263-4fdc6967c379\") " pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.277009 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c040078-5a12-407b-9263-4fdc6967c379-utilities\") pod \"redhat-marketplace-wqxnc\" (UID: \"4c040078-5a12-407b-9263-4fdc6967c379\") " pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.299954 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkvsn\" (UniqueName: \"kubernetes.io/projected/4c040078-5a12-407b-9263-4fdc6967c379-kube-api-access-zkvsn\") pod \"redhat-marketplace-wqxnc\" (UID: \"4c040078-5a12-407b-9263-4fdc6967c379\") " pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.397567 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:28 crc kubenswrapper[4717]: I0221 22:33:28.851008 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqxnc"] Feb 21 22:33:29 crc kubenswrapper[4717]: I0221 22:33:29.722070 4717 generic.go:334] "Generic (PLEG): container finished" podID="4c040078-5a12-407b-9263-4fdc6967c379" containerID="6796f95cf2eac6b179a4620486a5cfa1abd4189221104c8c4bb20fd80d9878b7" exitCode=0 Feb 21 22:33:29 crc kubenswrapper[4717]: I0221 22:33:29.722146 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqxnc" event={"ID":"4c040078-5a12-407b-9263-4fdc6967c379","Type":"ContainerDied","Data":"6796f95cf2eac6b179a4620486a5cfa1abd4189221104c8c4bb20fd80d9878b7"} Feb 21 22:33:29 crc kubenswrapper[4717]: I0221 22:33:29.722188 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqxnc" event={"ID":"4c040078-5a12-407b-9263-4fdc6967c379","Type":"ContainerStarted","Data":"e73f1e98dde2610dcb19c38e4189de9994486fd6086249d768d02ec6c23db06d"} Feb 21 22:33:30 crc kubenswrapper[4717]: I0221 22:33:30.740227 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqxnc" event={"ID":"4c040078-5a12-407b-9263-4fdc6967c379","Type":"ContainerStarted","Data":"27141754172d611e2597a0634e37819f35b0eec711e9008402e8f677e72dd2ee"} Feb 21 22:33:31 crc kubenswrapper[4717]: I0221 22:33:31.752923 4717 generic.go:334] "Generic (PLEG): container finished" podID="4c040078-5a12-407b-9263-4fdc6967c379" containerID="27141754172d611e2597a0634e37819f35b0eec711e9008402e8f677e72dd2ee" exitCode=0 Feb 21 22:33:31 crc kubenswrapper[4717]: I0221 22:33:31.753011 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqxnc" 
event={"ID":"4c040078-5a12-407b-9263-4fdc6967c379","Type":"ContainerDied","Data":"27141754172d611e2597a0634e37819f35b0eec711e9008402e8f677e72dd2ee"} Feb 21 22:33:32 crc kubenswrapper[4717]: I0221 22:33:32.766900 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqxnc" event={"ID":"4c040078-5a12-407b-9263-4fdc6967c379","Type":"ContainerStarted","Data":"5266aca3bac8681b19a5aed14ff8b3f8155f1b4ddbcf20d9d29cfccb0ce84a21"} Feb 21 22:33:32 crc kubenswrapper[4717]: I0221 22:33:32.784361 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wqxnc" podStartSLOduration=2.378541477 podStartE2EDuration="4.784327315s" podCreationTimestamp="2026-02-21 22:33:28 +0000 UTC" firstStartedPulling="2026-02-21 22:33:29.724959704 +0000 UTC m=+2824.506493366" lastFinishedPulling="2026-02-21 22:33:32.130745582 +0000 UTC m=+2826.912279204" observedRunningTime="2026-02-21 22:33:32.784317525 +0000 UTC m=+2827.565851167" watchObservedRunningTime="2026-02-21 22:33:32.784327315 +0000 UTC m=+2827.565860977" Feb 21 22:33:38 crc kubenswrapper[4717]: I0221 22:33:38.398070 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:38 crc kubenswrapper[4717]: I0221 22:33:38.398538 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:38 crc kubenswrapper[4717]: I0221 22:33:38.466618 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:38 crc kubenswrapper[4717]: I0221 22:33:38.880841 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:38 crc kubenswrapper[4717]: I0221 22:33:38.942583 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-wqxnc"] Feb 21 22:33:39 crc kubenswrapper[4717]: I0221 22:33:39.062519 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:33:39 crc kubenswrapper[4717]: I0221 22:33:39.062589 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:33:40 crc kubenswrapper[4717]: I0221 22:33:40.850661 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wqxnc" podUID="4c040078-5a12-407b-9263-4fdc6967c379" containerName="registry-server" containerID="cri-o://5266aca3bac8681b19a5aed14ff8b3f8155f1b4ddbcf20d9d29cfccb0ce84a21" gracePeriod=2 Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.386170 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.446758 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c040078-5a12-407b-9263-4fdc6967c379-utilities\") pod \"4c040078-5a12-407b-9263-4fdc6967c379\" (UID: \"4c040078-5a12-407b-9263-4fdc6967c379\") " Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.446835 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvsn\" (UniqueName: \"kubernetes.io/projected/4c040078-5a12-407b-9263-4fdc6967c379-kube-api-access-zkvsn\") pod \"4c040078-5a12-407b-9263-4fdc6967c379\" (UID: \"4c040078-5a12-407b-9263-4fdc6967c379\") " Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.446920 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c040078-5a12-407b-9263-4fdc6967c379-catalog-content\") pod \"4c040078-5a12-407b-9263-4fdc6967c379\" (UID: \"4c040078-5a12-407b-9263-4fdc6967c379\") " Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.448544 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c040078-5a12-407b-9263-4fdc6967c379-utilities" (OuterVolumeSpecName: "utilities") pod "4c040078-5a12-407b-9263-4fdc6967c379" (UID: "4c040078-5a12-407b-9263-4fdc6967c379"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.464100 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c040078-5a12-407b-9263-4fdc6967c379-kube-api-access-zkvsn" (OuterVolumeSpecName: "kube-api-access-zkvsn") pod "4c040078-5a12-407b-9263-4fdc6967c379" (UID: "4c040078-5a12-407b-9263-4fdc6967c379"). InnerVolumeSpecName "kube-api-access-zkvsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.505213 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c040078-5a12-407b-9263-4fdc6967c379-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c040078-5a12-407b-9263-4fdc6967c379" (UID: "4c040078-5a12-407b-9263-4fdc6967c379"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.548788 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c040078-5a12-407b-9263-4fdc6967c379-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.548818 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvsn\" (UniqueName: \"kubernetes.io/projected/4c040078-5a12-407b-9263-4fdc6967c379-kube-api-access-zkvsn\") on node \"crc\" DevicePath \"\"" Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.548830 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c040078-5a12-407b-9263-4fdc6967c379-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.864815 4717 generic.go:334] "Generic (PLEG): container finished" podID="4c040078-5a12-407b-9263-4fdc6967c379" containerID="5266aca3bac8681b19a5aed14ff8b3f8155f1b4ddbcf20d9d29cfccb0ce84a21" exitCode=0 Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.864957 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wqxnc" event={"ID":"4c040078-5a12-407b-9263-4fdc6967c379","Type":"ContainerDied","Data":"5266aca3bac8681b19a5aed14ff8b3f8155f1b4ddbcf20d9d29cfccb0ce84a21"} Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.865002 4717 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-wqxnc" event={"ID":"4c040078-5a12-407b-9263-4fdc6967c379","Type":"ContainerDied","Data":"e73f1e98dde2610dcb19c38e4189de9994486fd6086249d768d02ec6c23db06d"} Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.865036 4717 scope.go:117] "RemoveContainer" containerID="5266aca3bac8681b19a5aed14ff8b3f8155f1b4ddbcf20d9d29cfccb0ce84a21" Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.865136 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wqxnc" Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.908726 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqxnc"] Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.909268 4717 scope.go:117] "RemoveContainer" containerID="27141754172d611e2597a0634e37819f35b0eec711e9008402e8f677e72dd2ee" Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.921125 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wqxnc"] Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.945550 4717 scope.go:117] "RemoveContainer" containerID="6796f95cf2eac6b179a4620486a5cfa1abd4189221104c8c4bb20fd80d9878b7" Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.987486 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c040078-5a12-407b-9263-4fdc6967c379" path="/var/lib/kubelet/pods/4c040078-5a12-407b-9263-4fdc6967c379/volumes" Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.999392 4717 scope.go:117] "RemoveContainer" containerID="5266aca3bac8681b19a5aed14ff8b3f8155f1b4ddbcf20d9d29cfccb0ce84a21" Feb 21 22:33:41 crc kubenswrapper[4717]: E0221 22:33:41.999810 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5266aca3bac8681b19a5aed14ff8b3f8155f1b4ddbcf20d9d29cfccb0ce84a21\": container with ID 
starting with 5266aca3bac8681b19a5aed14ff8b3f8155f1b4ddbcf20d9d29cfccb0ce84a21 not found: ID does not exist" containerID="5266aca3bac8681b19a5aed14ff8b3f8155f1b4ddbcf20d9d29cfccb0ce84a21" Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.999842 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5266aca3bac8681b19a5aed14ff8b3f8155f1b4ddbcf20d9d29cfccb0ce84a21"} err="failed to get container status \"5266aca3bac8681b19a5aed14ff8b3f8155f1b4ddbcf20d9d29cfccb0ce84a21\": rpc error: code = NotFound desc = could not find container \"5266aca3bac8681b19a5aed14ff8b3f8155f1b4ddbcf20d9d29cfccb0ce84a21\": container with ID starting with 5266aca3bac8681b19a5aed14ff8b3f8155f1b4ddbcf20d9d29cfccb0ce84a21 not found: ID does not exist" Feb 21 22:33:41 crc kubenswrapper[4717]: I0221 22:33:41.999873 4717 scope.go:117] "RemoveContainer" containerID="27141754172d611e2597a0634e37819f35b0eec711e9008402e8f677e72dd2ee" Feb 21 22:33:42 crc kubenswrapper[4717]: E0221 22:33:42.000213 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27141754172d611e2597a0634e37819f35b0eec711e9008402e8f677e72dd2ee\": container with ID starting with 27141754172d611e2597a0634e37819f35b0eec711e9008402e8f677e72dd2ee not found: ID does not exist" containerID="27141754172d611e2597a0634e37819f35b0eec711e9008402e8f677e72dd2ee" Feb 21 22:33:42 crc kubenswrapper[4717]: I0221 22:33:42.000255 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27141754172d611e2597a0634e37819f35b0eec711e9008402e8f677e72dd2ee"} err="failed to get container status \"27141754172d611e2597a0634e37819f35b0eec711e9008402e8f677e72dd2ee\": rpc error: code = NotFound desc = could not find container \"27141754172d611e2597a0634e37819f35b0eec711e9008402e8f677e72dd2ee\": container with ID starting with 27141754172d611e2597a0634e37819f35b0eec711e9008402e8f677e72dd2ee not found: 
ID does not exist" Feb 21 22:33:42 crc kubenswrapper[4717]: I0221 22:33:42.000284 4717 scope.go:117] "RemoveContainer" containerID="6796f95cf2eac6b179a4620486a5cfa1abd4189221104c8c4bb20fd80d9878b7" Feb 21 22:33:42 crc kubenswrapper[4717]: E0221 22:33:42.000725 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6796f95cf2eac6b179a4620486a5cfa1abd4189221104c8c4bb20fd80d9878b7\": container with ID starting with 6796f95cf2eac6b179a4620486a5cfa1abd4189221104c8c4bb20fd80d9878b7 not found: ID does not exist" containerID="6796f95cf2eac6b179a4620486a5cfa1abd4189221104c8c4bb20fd80d9878b7" Feb 21 22:33:42 crc kubenswrapper[4717]: I0221 22:33:42.000748 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6796f95cf2eac6b179a4620486a5cfa1abd4189221104c8c4bb20fd80d9878b7"} err="failed to get container status \"6796f95cf2eac6b179a4620486a5cfa1abd4189221104c8c4bb20fd80d9878b7\": rpc error: code = NotFound desc = could not find container \"6796f95cf2eac6b179a4620486a5cfa1abd4189221104c8c4bb20fd80d9878b7\": container with ID starting with 6796f95cf2eac6b179a4620486a5cfa1abd4189221104c8c4bb20fd80d9878b7 not found: ID does not exist" Feb 21 22:34:02 crc kubenswrapper[4717]: I0221 22:34:02.866622 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5c8rh"] Feb 21 22:34:02 crc kubenswrapper[4717]: E0221 22:34:02.867814 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c040078-5a12-407b-9263-4fdc6967c379" containerName="extract-utilities" Feb 21 22:34:02 crc kubenswrapper[4717]: I0221 22:34:02.867836 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c040078-5a12-407b-9263-4fdc6967c379" containerName="extract-utilities" Feb 21 22:34:02 crc kubenswrapper[4717]: E0221 22:34:02.867892 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c040078-5a12-407b-9263-4fdc6967c379" 
containerName="registry-server" Feb 21 22:34:02 crc kubenswrapper[4717]: I0221 22:34:02.867904 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c040078-5a12-407b-9263-4fdc6967c379" containerName="registry-server" Feb 21 22:34:02 crc kubenswrapper[4717]: E0221 22:34:02.867955 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c040078-5a12-407b-9263-4fdc6967c379" containerName="extract-content" Feb 21 22:34:02 crc kubenswrapper[4717]: I0221 22:34:02.867969 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c040078-5a12-407b-9263-4fdc6967c379" containerName="extract-content" Feb 21 22:34:02 crc kubenswrapper[4717]: I0221 22:34:02.868277 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c040078-5a12-407b-9263-4fdc6967c379" containerName="registry-server" Feb 21 22:34:02 crc kubenswrapper[4717]: I0221 22:34:02.870391 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:02 crc kubenswrapper[4717]: I0221 22:34:02.881506 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5c8rh"] Feb 21 22:34:03 crc kubenswrapper[4717]: I0221 22:34:03.023770 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-utilities\") pod \"redhat-operators-5c8rh\" (UID: \"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43\") " pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:03 crc kubenswrapper[4717]: I0221 22:34:03.024135 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-catalog-content\") pod \"redhat-operators-5c8rh\" (UID: \"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43\") " pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 
22:34:03 crc kubenswrapper[4717]: I0221 22:34:03.024352 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xd4d\" (UniqueName: \"kubernetes.io/projected/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-kube-api-access-8xd4d\") pod \"redhat-operators-5c8rh\" (UID: \"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43\") " pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:03 crc kubenswrapper[4717]: I0221 22:34:03.126546 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-utilities\") pod \"redhat-operators-5c8rh\" (UID: \"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43\") " pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:03 crc kubenswrapper[4717]: I0221 22:34:03.126633 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-catalog-content\") pod \"redhat-operators-5c8rh\" (UID: \"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43\") " pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:03 crc kubenswrapper[4717]: I0221 22:34:03.127184 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-catalog-content\") pod \"redhat-operators-5c8rh\" (UID: \"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43\") " pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:03 crc kubenswrapper[4717]: I0221 22:34:03.127558 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xd4d\" (UniqueName: \"kubernetes.io/projected/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-kube-api-access-8xd4d\") pod \"redhat-operators-5c8rh\" (UID: \"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43\") " pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:03 crc 
kubenswrapper[4717]: I0221 22:34:03.128720 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-utilities\") pod \"redhat-operators-5c8rh\" (UID: \"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43\") " pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:03 crc kubenswrapper[4717]: I0221 22:34:03.161374 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xd4d\" (UniqueName: \"kubernetes.io/projected/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-kube-api-access-8xd4d\") pod \"redhat-operators-5c8rh\" (UID: \"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43\") " pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:03 crc kubenswrapper[4717]: I0221 22:34:03.242966 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:03 crc kubenswrapper[4717]: I0221 22:34:03.735793 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5c8rh"] Feb 21 22:34:04 crc kubenswrapper[4717]: I0221 22:34:04.098242 4717 generic.go:334] "Generic (PLEG): container finished" podID="46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" containerID="07603b71620ab3531de5f0feb5bcb281c4b38f9a060a8a931afc2d6382c3806a" exitCode=0 Feb 21 22:34:04 crc kubenswrapper[4717]: I0221 22:34:04.098308 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c8rh" event={"ID":"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43","Type":"ContainerDied","Data":"07603b71620ab3531de5f0feb5bcb281c4b38f9a060a8a931afc2d6382c3806a"} Feb 21 22:34:04 crc kubenswrapper[4717]: I0221 22:34:04.098388 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c8rh" 
event={"ID":"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43","Type":"ContainerStarted","Data":"17006dcc5ce4fd68554183dd8925085270b72cda43d855e9918cb783c60b5a1a"} Feb 21 22:34:04 crc kubenswrapper[4717]: I0221 22:34:04.100238 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 21 22:34:05 crc kubenswrapper[4717]: I0221 22:34:05.113208 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c8rh" event={"ID":"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43","Type":"ContainerStarted","Data":"b98f123e781135c5a80c42d37f585e7bc56f8da58df2bd6a389f55ba30577aea"} Feb 21 22:34:06 crc kubenswrapper[4717]: I0221 22:34:06.131639 4717 generic.go:334] "Generic (PLEG): container finished" podID="46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" containerID="b98f123e781135c5a80c42d37f585e7bc56f8da58df2bd6a389f55ba30577aea" exitCode=0 Feb 21 22:34:06 crc kubenswrapper[4717]: I0221 22:34:06.131744 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c8rh" event={"ID":"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43","Type":"ContainerDied","Data":"b98f123e781135c5a80c42d37f585e7bc56f8da58df2bd6a389f55ba30577aea"} Feb 21 22:34:08 crc kubenswrapper[4717]: I0221 22:34:08.160644 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c8rh" event={"ID":"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43","Type":"ContainerStarted","Data":"4f24b1ab05df1bb66b952e82000a609b4041114a49cf357175d45c4950602497"} Feb 21 22:34:08 crc kubenswrapper[4717]: I0221 22:34:08.201325 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5c8rh" podStartSLOduration=3.595378023 podStartE2EDuration="6.201294109s" podCreationTimestamp="2026-02-21 22:34:02 +0000 UTC" firstStartedPulling="2026-02-21 22:34:04.099925089 +0000 UTC m=+2858.881458711" lastFinishedPulling="2026-02-21 22:34:06.705841165 +0000 UTC m=+2861.487374797" 
observedRunningTime="2026-02-21 22:34:08.190415249 +0000 UTC m=+2862.971948921" watchObservedRunningTime="2026-02-21 22:34:08.201294109 +0000 UTC m=+2862.982827771" Feb 21 22:34:09 crc kubenswrapper[4717]: I0221 22:34:09.063057 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:34:09 crc kubenswrapper[4717]: I0221 22:34:09.063152 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:34:09 crc kubenswrapper[4717]: I0221 22:34:09.063225 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 22:34:09 crc kubenswrapper[4717]: I0221 22:34:09.064409 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 22:34:09 crc kubenswrapper[4717]: I0221 22:34:09.064517 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" gracePeriod=600 Feb 21 22:34:09 crc kubenswrapper[4717]: E0221 
22:34:09.202527 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:34:10 crc kubenswrapper[4717]: I0221 22:34:10.182612 4717 generic.go:334] "Generic (PLEG): container finished" podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" exitCode=0 Feb 21 22:34:10 crc kubenswrapper[4717]: I0221 22:34:10.182656 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674"} Feb 21 22:34:10 crc kubenswrapper[4717]: I0221 22:34:10.182686 4717 scope.go:117] "RemoveContainer" containerID="9d91e1e907ed42ee54b189c1b4dbcc4ded12cabfe2d35ffcc00d8405cf56fc50" Feb 21 22:34:10 crc kubenswrapper[4717]: I0221 22:34:10.183294 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:34:10 crc kubenswrapper[4717]: E0221 22:34:10.183772 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:34:13 crc kubenswrapper[4717]: I0221 22:34:13.243230 4717 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:13 crc kubenswrapper[4717]: I0221 22:34:13.243761 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:14 crc kubenswrapper[4717]: I0221 22:34:14.311181 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5c8rh" podUID="46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" containerName="registry-server" probeResult="failure" output=< Feb 21 22:34:14 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Feb 21 22:34:14 crc kubenswrapper[4717]: > Feb 21 22:34:22 crc kubenswrapper[4717]: E0221 22:34:22.055771 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Feb 21 22:34:23 crc kubenswrapper[4717]: I0221 22:34:23.312618 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:23 crc kubenswrapper[4717]: I0221 22:34:23.383891 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:23 crc kubenswrapper[4717]: I0221 22:34:23.564851 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5c8rh"] Feb 21 22:34:24 crc kubenswrapper[4717]: I0221 22:34:24.342503 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5c8rh" podUID="46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" containerName="registry-server" containerID="cri-o://4f24b1ab05df1bb66b952e82000a609b4041114a49cf357175d45c4950602497" gracePeriod=2 Feb 21 22:34:24 crc kubenswrapper[4717]: I0221 22:34:24.904403 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:24 crc kubenswrapper[4717]: I0221 22:34:24.978616 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:34:24 crc kubenswrapper[4717]: E0221 22:34:24.978815 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:34:24 crc kubenswrapper[4717]: I0221 22:34:24.996577 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-catalog-content\") pod \"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43\" (UID: \"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43\") " Feb 21 22:34:24 crc kubenswrapper[4717]: I0221 22:34:24.996796 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xd4d\" (UniqueName: \"kubernetes.io/projected/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-kube-api-access-8xd4d\") pod \"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43\" (UID: \"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43\") " Feb 21 22:34:24 crc kubenswrapper[4717]: I0221 22:34:24.996857 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-utilities\") pod \"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43\" (UID: \"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43\") " Feb 21 22:34:24 crc kubenswrapper[4717]: I0221 22:34:24.997824 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-utilities" (OuterVolumeSpecName: "utilities") pod "46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" (UID: "46d7c80d-97ca-4e8b-a7e5-ee8a93053d43"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.005408 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-kube-api-access-8xd4d" (OuterVolumeSpecName: "kube-api-access-8xd4d") pod "46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" (UID: "46d7c80d-97ca-4e8b-a7e5-ee8a93053d43"). InnerVolumeSpecName "kube-api-access-8xd4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.100103 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xd4d\" (UniqueName: \"kubernetes.io/projected/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-kube-api-access-8xd4d\") on node \"crc\" DevicePath \"\"" Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.100135 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.113261 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" (UID: "46d7c80d-97ca-4e8b-a7e5-ee8a93053d43"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.202586 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.355300 4717 generic.go:334] "Generic (PLEG): container finished" podID="46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" containerID="4f24b1ab05df1bb66b952e82000a609b4041114a49cf357175d45c4950602497" exitCode=0 Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.355341 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c8rh" event={"ID":"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43","Type":"ContainerDied","Data":"4f24b1ab05df1bb66b952e82000a609b4041114a49cf357175d45c4950602497"} Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.355366 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c8rh" event={"ID":"46d7c80d-97ca-4e8b-a7e5-ee8a93053d43","Type":"ContainerDied","Data":"17006dcc5ce4fd68554183dd8925085270b72cda43d855e9918cb783c60b5a1a"} Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.355368 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5c8rh" Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.355382 4717 scope.go:117] "RemoveContainer" containerID="4f24b1ab05df1bb66b952e82000a609b4041114a49cf357175d45c4950602497" Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.394806 4717 scope.go:117] "RemoveContainer" containerID="b98f123e781135c5a80c42d37f585e7bc56f8da58df2bd6a389f55ba30577aea" Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.395838 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5c8rh"] Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.412893 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5c8rh"] Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.420267 4717 scope.go:117] "RemoveContainer" containerID="07603b71620ab3531de5f0feb5bcb281c4b38f9a060a8a931afc2d6382c3806a" Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.488940 4717 scope.go:117] "RemoveContainer" containerID="4f24b1ab05df1bb66b952e82000a609b4041114a49cf357175d45c4950602497" Feb 21 22:34:25 crc kubenswrapper[4717]: E0221 22:34:25.489503 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f24b1ab05df1bb66b952e82000a609b4041114a49cf357175d45c4950602497\": container with ID starting with 4f24b1ab05df1bb66b952e82000a609b4041114a49cf357175d45c4950602497 not found: ID does not exist" containerID="4f24b1ab05df1bb66b952e82000a609b4041114a49cf357175d45c4950602497" Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.489555 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f24b1ab05df1bb66b952e82000a609b4041114a49cf357175d45c4950602497"} err="failed to get container status \"4f24b1ab05df1bb66b952e82000a609b4041114a49cf357175d45c4950602497\": rpc error: code = NotFound desc = could not find container 
\"4f24b1ab05df1bb66b952e82000a609b4041114a49cf357175d45c4950602497\": container with ID starting with 4f24b1ab05df1bb66b952e82000a609b4041114a49cf357175d45c4950602497 not found: ID does not exist" Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.489594 4717 scope.go:117] "RemoveContainer" containerID="b98f123e781135c5a80c42d37f585e7bc56f8da58df2bd6a389f55ba30577aea" Feb 21 22:34:25 crc kubenswrapper[4717]: E0221 22:34:25.490199 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b98f123e781135c5a80c42d37f585e7bc56f8da58df2bd6a389f55ba30577aea\": container with ID starting with b98f123e781135c5a80c42d37f585e7bc56f8da58df2bd6a389f55ba30577aea not found: ID does not exist" containerID="b98f123e781135c5a80c42d37f585e7bc56f8da58df2bd6a389f55ba30577aea" Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.490244 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b98f123e781135c5a80c42d37f585e7bc56f8da58df2bd6a389f55ba30577aea"} err="failed to get container status \"b98f123e781135c5a80c42d37f585e7bc56f8da58df2bd6a389f55ba30577aea\": rpc error: code = NotFound desc = could not find container \"b98f123e781135c5a80c42d37f585e7bc56f8da58df2bd6a389f55ba30577aea\": container with ID starting with b98f123e781135c5a80c42d37f585e7bc56f8da58df2bd6a389f55ba30577aea not found: ID does not exist" Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.490270 4717 scope.go:117] "RemoveContainer" containerID="07603b71620ab3531de5f0feb5bcb281c4b38f9a060a8a931afc2d6382c3806a" Feb 21 22:34:25 crc kubenswrapper[4717]: E0221 22:34:25.490743 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07603b71620ab3531de5f0feb5bcb281c4b38f9a060a8a931afc2d6382c3806a\": container with ID starting with 07603b71620ab3531de5f0feb5bcb281c4b38f9a060a8a931afc2d6382c3806a not found: ID does not exist" 
containerID="07603b71620ab3531de5f0feb5bcb281c4b38f9a060a8a931afc2d6382c3806a"
Feb 21 22:34:25 crc kubenswrapper[4717]: I0221 22:34:25.490786 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07603b71620ab3531de5f0feb5bcb281c4b38f9a060a8a931afc2d6382c3806a"} err="failed to get container status \"07603b71620ab3531de5f0feb5bcb281c4b38f9a060a8a931afc2d6382c3806a\": rpc error: code = NotFound desc = could not find container \"07603b71620ab3531de5f0feb5bcb281c4b38f9a060a8a931afc2d6382c3806a\": container with ID starting with 07603b71620ab3531de5f0feb5bcb281c4b38f9a060a8a931afc2d6382c3806a not found: ID does not exist"
Feb 21 22:34:26 crc kubenswrapper[4717]: I0221 22:34:26.002026 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" path="/var/lib/kubelet/pods/46d7c80d-97ca-4e8b-a7e5-ee8a93053d43/volumes"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.180935 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xx7td"]
Feb 21 22:34:30 crc kubenswrapper[4717]: E0221 22:34:30.182197 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" containerName="registry-server"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.182214 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" containerName="registry-server"
Feb 21 22:34:30 crc kubenswrapper[4717]: E0221 22:34:30.182235 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" containerName="extract-utilities"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.182248 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" containerName="extract-utilities"
Feb 21 22:34:30 crc kubenswrapper[4717]: E0221 22:34:30.182272 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" containerName="extract-content"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.182281 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" containerName="extract-content"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.182462 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d7c80d-97ca-4e8b-a7e5-ee8a93053d43" containerName="registry-server"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.184394 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.196000 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xx7td"]
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.305347 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edad7363-6509-4b8e-b904-a94d38d1e16a-catalog-content\") pod \"community-operators-xx7td\" (UID: \"edad7363-6509-4b8e-b904-a94d38d1e16a\") " pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.305399 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edad7363-6509-4b8e-b904-a94d38d1e16a-utilities\") pod \"community-operators-xx7td\" (UID: \"edad7363-6509-4b8e-b904-a94d38d1e16a\") " pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.305478 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flc8q\" (UniqueName: \"kubernetes.io/projected/edad7363-6509-4b8e-b904-a94d38d1e16a-kube-api-access-flc8q\") pod \"community-operators-xx7td\" (UID: \"edad7363-6509-4b8e-b904-a94d38d1e16a\") " pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.406906 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flc8q\" (UniqueName: \"kubernetes.io/projected/edad7363-6509-4b8e-b904-a94d38d1e16a-kube-api-access-flc8q\") pod \"community-operators-xx7td\" (UID: \"edad7363-6509-4b8e-b904-a94d38d1e16a\") " pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.407062 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edad7363-6509-4b8e-b904-a94d38d1e16a-catalog-content\") pod \"community-operators-xx7td\" (UID: \"edad7363-6509-4b8e-b904-a94d38d1e16a\") " pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.407085 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edad7363-6509-4b8e-b904-a94d38d1e16a-utilities\") pod \"community-operators-xx7td\" (UID: \"edad7363-6509-4b8e-b904-a94d38d1e16a\") " pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.407558 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edad7363-6509-4b8e-b904-a94d38d1e16a-utilities\") pod \"community-operators-xx7td\" (UID: \"edad7363-6509-4b8e-b904-a94d38d1e16a\") " pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.407892 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edad7363-6509-4b8e-b904-a94d38d1e16a-catalog-content\") pod \"community-operators-xx7td\" (UID: \"edad7363-6509-4b8e-b904-a94d38d1e16a\") " pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.427794 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flc8q\" (UniqueName: \"kubernetes.io/projected/edad7363-6509-4b8e-b904-a94d38d1e16a-kube-api-access-flc8q\") pod \"community-operators-xx7td\" (UID: \"edad7363-6509-4b8e-b904-a94d38d1e16a\") " pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:30 crc kubenswrapper[4717]: I0221 22:34:30.565005 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:31 crc kubenswrapper[4717]: I0221 22:34:31.116792 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xx7td"]
Feb 21 22:34:31 crc kubenswrapper[4717]: W0221 22:34:31.127673 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedad7363_6509_4b8e_b904_a94d38d1e16a.slice/crio-3c0e32c35d1245729786c5c6d955376453747b3d4952f4bb968d6fbb54e224e8 WatchSource:0}: Error finding container 3c0e32c35d1245729786c5c6d955376453747b3d4952f4bb968d6fbb54e224e8: Status 404 returned error can't find the container with id 3c0e32c35d1245729786c5c6d955376453747b3d4952f4bb968d6fbb54e224e8
Feb 21 22:34:31 crc kubenswrapper[4717]: I0221 22:34:31.408479 4717 generic.go:334] "Generic (PLEG): container finished" podID="edad7363-6509-4b8e-b904-a94d38d1e16a" containerID="b1c3e624c9cb2d25583812c7cea622f767f94d0c4ab02cf3287d9acece76df93" exitCode=0
Feb 21 22:34:31 crc kubenswrapper[4717]: I0221 22:34:31.408691 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xx7td" event={"ID":"edad7363-6509-4b8e-b904-a94d38d1e16a","Type":"ContainerDied","Data":"b1c3e624c9cb2d25583812c7cea622f767f94d0c4ab02cf3287d9acece76df93"}
Feb 21 22:34:31 crc kubenswrapper[4717]: I0221 22:34:31.408984 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xx7td" event={"ID":"edad7363-6509-4b8e-b904-a94d38d1e16a","Type":"ContainerStarted","Data":"3c0e32c35d1245729786c5c6d955376453747b3d4952f4bb968d6fbb54e224e8"}
Feb 21 22:34:32 crc kubenswrapper[4717]: I0221 22:34:32.419443 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xx7td" event={"ID":"edad7363-6509-4b8e-b904-a94d38d1e16a","Type":"ContainerStarted","Data":"be68495798eafd47c78794f8533deb2fdeff4ac015802168994c790a6349e1a6"}
Feb 21 22:34:33 crc kubenswrapper[4717]: I0221 22:34:33.432347 4717 generic.go:334] "Generic (PLEG): container finished" podID="edad7363-6509-4b8e-b904-a94d38d1e16a" containerID="be68495798eafd47c78794f8533deb2fdeff4ac015802168994c790a6349e1a6" exitCode=0
Feb 21 22:34:33 crc kubenswrapper[4717]: I0221 22:34:33.432403 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xx7td" event={"ID":"edad7363-6509-4b8e-b904-a94d38d1e16a","Type":"ContainerDied","Data":"be68495798eafd47c78794f8533deb2fdeff4ac015802168994c790a6349e1a6"}
Feb 21 22:34:34 crc kubenswrapper[4717]: I0221 22:34:34.443593 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xx7td" event={"ID":"edad7363-6509-4b8e-b904-a94d38d1e16a","Type":"ContainerStarted","Data":"e70dd1c0b4ca6c29b9c85109e869ff76ea0a98e9806c4c5aad0bda7abf1510d5"}
Feb 21 22:34:34 crc kubenswrapper[4717]: I0221 22:34:34.471361 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xx7td" podStartSLOduration=2.065868557 podStartE2EDuration="4.471327657s" podCreationTimestamp="2026-02-21 22:34:30 +0000 UTC" firstStartedPulling="2026-02-21 22:34:31.413900603 +0000 UTC m=+2886.195434225" lastFinishedPulling="2026-02-21 22:34:33.819359703 +0000 UTC m=+2888.600893325" observedRunningTime="2026-02-21 22:34:34.468407238 +0000 UTC m=+2889.249940870" watchObservedRunningTime="2026-02-21 22:34:34.471327657 +0000 UTC m=+2889.252861279"
Feb 21 22:34:36 crc kubenswrapper[4717]: I0221 22:34:36.976217 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674"
Feb 21 22:34:36 crc kubenswrapper[4717]: E0221 22:34:36.977111 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:34:40 crc kubenswrapper[4717]: I0221 22:34:40.565576 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:40 crc kubenswrapper[4717]: I0221 22:34:40.566064 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:40 crc kubenswrapper[4717]: I0221 22:34:40.649640 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:41 crc kubenswrapper[4717]: I0221 22:34:41.601023 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:41 crc kubenswrapper[4717]: I0221 22:34:41.670572 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xx7td"]
Feb 21 22:34:43 crc kubenswrapper[4717]: I0221 22:34:43.539800 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xx7td" podUID="edad7363-6509-4b8e-b904-a94d38d1e16a" containerName="registry-server" containerID="cri-o://e70dd1c0b4ca6c29b9c85109e869ff76ea0a98e9806c4c5aad0bda7abf1510d5" gracePeriod=2
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.070644 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.234743 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edad7363-6509-4b8e-b904-a94d38d1e16a-utilities\") pod \"edad7363-6509-4b8e-b904-a94d38d1e16a\" (UID: \"edad7363-6509-4b8e-b904-a94d38d1e16a\") "
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.234983 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edad7363-6509-4b8e-b904-a94d38d1e16a-catalog-content\") pod \"edad7363-6509-4b8e-b904-a94d38d1e16a\" (UID: \"edad7363-6509-4b8e-b904-a94d38d1e16a\") "
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.235102 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flc8q\" (UniqueName: \"kubernetes.io/projected/edad7363-6509-4b8e-b904-a94d38d1e16a-kube-api-access-flc8q\") pod \"edad7363-6509-4b8e-b904-a94d38d1e16a\" (UID: \"edad7363-6509-4b8e-b904-a94d38d1e16a\") "
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.235854 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edad7363-6509-4b8e-b904-a94d38d1e16a-utilities" (OuterVolumeSpecName: "utilities") pod "edad7363-6509-4b8e-b904-a94d38d1e16a" (UID: "edad7363-6509-4b8e-b904-a94d38d1e16a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.242148 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edad7363-6509-4b8e-b904-a94d38d1e16a-kube-api-access-flc8q" (OuterVolumeSpecName: "kube-api-access-flc8q") pod "edad7363-6509-4b8e-b904-a94d38d1e16a" (UID: "edad7363-6509-4b8e-b904-a94d38d1e16a"). InnerVolumeSpecName "kube-api-access-flc8q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.295366 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edad7363-6509-4b8e-b904-a94d38d1e16a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edad7363-6509-4b8e-b904-a94d38d1e16a" (UID: "edad7363-6509-4b8e-b904-a94d38d1e16a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.336994 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edad7363-6509-4b8e-b904-a94d38d1e16a-utilities\") on node \"crc\" DevicePath \"\""
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.337029 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edad7363-6509-4b8e-b904-a94d38d1e16a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.337047 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flc8q\" (UniqueName: \"kubernetes.io/projected/edad7363-6509-4b8e-b904-a94d38d1e16a-kube-api-access-flc8q\") on node \"crc\" DevicePath \"\""
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.554065 4717 generic.go:334] "Generic (PLEG): container finished" podID="edad7363-6509-4b8e-b904-a94d38d1e16a" containerID="e70dd1c0b4ca6c29b9c85109e869ff76ea0a98e9806c4c5aad0bda7abf1510d5" exitCode=0
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.554125 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xx7td" event={"ID":"edad7363-6509-4b8e-b904-a94d38d1e16a","Type":"ContainerDied","Data":"e70dd1c0b4ca6c29b9c85109e869ff76ea0a98e9806c4c5aad0bda7abf1510d5"}
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.554154 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xx7td"
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.554180 4717 scope.go:117] "RemoveContainer" containerID="e70dd1c0b4ca6c29b9c85109e869ff76ea0a98e9806c4c5aad0bda7abf1510d5"
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.554163 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xx7td" event={"ID":"edad7363-6509-4b8e-b904-a94d38d1e16a","Type":"ContainerDied","Data":"3c0e32c35d1245729786c5c6d955376453747b3d4952f4bb968d6fbb54e224e8"}
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.588496 4717 scope.go:117] "RemoveContainer" containerID="be68495798eafd47c78794f8533deb2fdeff4ac015802168994c790a6349e1a6"
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.609545 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xx7td"]
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.623576 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xx7td"]
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.631816 4717 scope.go:117] "RemoveContainer" containerID="b1c3e624c9cb2d25583812c7cea622f767f94d0c4ab02cf3287d9acece76df93"
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.679156 4717 scope.go:117] "RemoveContainer" containerID="e70dd1c0b4ca6c29b9c85109e869ff76ea0a98e9806c4c5aad0bda7abf1510d5"
Feb 21 22:34:44 crc kubenswrapper[4717]: E0221 22:34:44.679802 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70dd1c0b4ca6c29b9c85109e869ff76ea0a98e9806c4c5aad0bda7abf1510d5\": container with ID starting with e70dd1c0b4ca6c29b9c85109e869ff76ea0a98e9806c4c5aad0bda7abf1510d5 not found: ID does not exist" containerID="e70dd1c0b4ca6c29b9c85109e869ff76ea0a98e9806c4c5aad0bda7abf1510d5"
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.679934 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70dd1c0b4ca6c29b9c85109e869ff76ea0a98e9806c4c5aad0bda7abf1510d5"} err="failed to get container status \"e70dd1c0b4ca6c29b9c85109e869ff76ea0a98e9806c4c5aad0bda7abf1510d5\": rpc error: code = NotFound desc = could not find container \"e70dd1c0b4ca6c29b9c85109e869ff76ea0a98e9806c4c5aad0bda7abf1510d5\": container with ID starting with e70dd1c0b4ca6c29b9c85109e869ff76ea0a98e9806c4c5aad0bda7abf1510d5 not found: ID does not exist"
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.679994 4717 scope.go:117] "RemoveContainer" containerID="be68495798eafd47c78794f8533deb2fdeff4ac015802168994c790a6349e1a6"
Feb 21 22:34:44 crc kubenswrapper[4717]: E0221 22:34:44.680839 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be68495798eafd47c78794f8533deb2fdeff4ac015802168994c790a6349e1a6\": container with ID starting with be68495798eafd47c78794f8533deb2fdeff4ac015802168994c790a6349e1a6 not found: ID does not exist" containerID="be68495798eafd47c78794f8533deb2fdeff4ac015802168994c790a6349e1a6"
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.680917 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be68495798eafd47c78794f8533deb2fdeff4ac015802168994c790a6349e1a6"} err="failed to get container status \"be68495798eafd47c78794f8533deb2fdeff4ac015802168994c790a6349e1a6\": rpc error: code = NotFound desc = could not find container \"be68495798eafd47c78794f8533deb2fdeff4ac015802168994c790a6349e1a6\": container with ID starting with be68495798eafd47c78794f8533deb2fdeff4ac015802168994c790a6349e1a6 not found: ID does not exist"
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.680953 4717 scope.go:117] "RemoveContainer" containerID="b1c3e624c9cb2d25583812c7cea622f767f94d0c4ab02cf3287d9acece76df93"
Feb 21 22:34:44 crc kubenswrapper[4717]: E0221 22:34:44.681618 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1c3e624c9cb2d25583812c7cea622f767f94d0c4ab02cf3287d9acece76df93\": container with ID starting with b1c3e624c9cb2d25583812c7cea622f767f94d0c4ab02cf3287d9acece76df93 not found: ID does not exist" containerID="b1c3e624c9cb2d25583812c7cea622f767f94d0c4ab02cf3287d9acece76df93"
Feb 21 22:34:44 crc kubenswrapper[4717]: I0221 22:34:44.681691 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1c3e624c9cb2d25583812c7cea622f767f94d0c4ab02cf3287d9acece76df93"} err="failed to get container status \"b1c3e624c9cb2d25583812c7cea622f767f94d0c4ab02cf3287d9acece76df93\": rpc error: code = NotFound desc = could not find container \"b1c3e624c9cb2d25583812c7cea622f767f94d0c4ab02cf3287d9acece76df93\": container with ID starting with b1c3e624c9cb2d25583812c7cea622f767f94d0c4ab02cf3287d9acece76df93 not found: ID does not exist"
Feb 21 22:34:45 crc kubenswrapper[4717]: I0221 22:34:45.992693 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edad7363-6509-4b8e-b904-a94d38d1e16a" path="/var/lib/kubelet/pods/edad7363-6509-4b8e-b904-a94d38d1e16a/volumes"
Feb 21 22:34:50 crc kubenswrapper[4717]: I0221 22:34:50.976761 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674"
Feb 21 22:34:50 crc kubenswrapper[4717]: E0221 22:34:50.977481 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:35:03 crc kubenswrapper[4717]: I0221 22:35:03.977129 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674"
Feb 21 22:35:03 crc kubenswrapper[4717]: E0221 22:35:03.978193 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:35:16 crc kubenswrapper[4717]: I0221 22:35:15.999910 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674"
Feb 21 22:35:16 crc kubenswrapper[4717]: E0221 22:35:16.004007 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:35:30 crc kubenswrapper[4717]: I0221 22:35:30.977205 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674"
Feb 21 22:35:30 crc kubenswrapper[4717]: E0221 22:35:30.978540 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:35:45 crc kubenswrapper[4717]: I0221 22:35:45.991202 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674"
Feb 21 22:35:45 crc kubenswrapper[4717]: E0221 22:35:45.992137 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:36:00 crc kubenswrapper[4717]: I0221 22:36:00.977515 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674"
Feb 21 22:36:00 crc kubenswrapper[4717]: E0221 22:36:00.978564 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:36:04 crc kubenswrapper[4717]: I0221 22:36:04.900799 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z942q"]
Feb 21 22:36:04 crc kubenswrapper[4717]: E0221 22:36:04.901480 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edad7363-6509-4b8e-b904-a94d38d1e16a" containerName="registry-server"
Feb 21 22:36:04 crc kubenswrapper[4717]: I0221 22:36:04.901493 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="edad7363-6509-4b8e-b904-a94d38d1e16a" containerName="registry-server"
Feb 21 22:36:04 crc kubenswrapper[4717]: E0221 22:36:04.901515 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edad7363-6509-4b8e-b904-a94d38d1e16a" containerName="extract-content"
Feb 21 22:36:04 crc kubenswrapper[4717]: I0221 22:36:04.901521 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="edad7363-6509-4b8e-b904-a94d38d1e16a" containerName="extract-content"
Feb 21 22:36:04 crc kubenswrapper[4717]: E0221 22:36:04.901547 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edad7363-6509-4b8e-b904-a94d38d1e16a" containerName="extract-utilities"
Feb 21 22:36:04 crc kubenswrapper[4717]: I0221 22:36:04.901554 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="edad7363-6509-4b8e-b904-a94d38d1e16a" containerName="extract-utilities"
Feb 21 22:36:04 crc kubenswrapper[4717]: I0221 22:36:04.901720 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="edad7363-6509-4b8e-b904-a94d38d1e16a" containerName="registry-server"
Feb 21 22:36:04 crc kubenswrapper[4717]: I0221 22:36:04.903218 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:04 crc kubenswrapper[4717]: I0221 22:36:04.934596 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z942q"]
Feb 21 22:36:04 crc kubenswrapper[4717]: I0221 22:36:04.966094 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjlcl\" (UniqueName: \"kubernetes.io/projected/9eb96c13-0790-449f-ab6a-08c5094e37b5-kube-api-access-hjlcl\") pod \"certified-operators-z942q\" (UID: \"9eb96c13-0790-449f-ab6a-08c5094e37b5\") " pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:04 crc kubenswrapper[4717]: I0221 22:36:04.966150 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9eb96c13-0790-449f-ab6a-08c5094e37b5-catalog-content\") pod \"certified-operators-z942q\" (UID: \"9eb96c13-0790-449f-ab6a-08c5094e37b5\") " pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:04 crc kubenswrapper[4717]: I0221 22:36:04.966200 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9eb96c13-0790-449f-ab6a-08c5094e37b5-utilities\") pod \"certified-operators-z942q\" (UID: \"9eb96c13-0790-449f-ab6a-08c5094e37b5\") " pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:05 crc kubenswrapper[4717]: I0221 22:36:05.067655 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjlcl\" (UniqueName: \"kubernetes.io/projected/9eb96c13-0790-449f-ab6a-08c5094e37b5-kube-api-access-hjlcl\") pod \"certified-operators-z942q\" (UID: \"9eb96c13-0790-449f-ab6a-08c5094e37b5\") " pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:05 crc kubenswrapper[4717]: I0221 22:36:05.067721 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9eb96c13-0790-449f-ab6a-08c5094e37b5-catalog-content\") pod \"certified-operators-z942q\" (UID: \"9eb96c13-0790-449f-ab6a-08c5094e37b5\") " pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:05 crc kubenswrapper[4717]: I0221 22:36:05.067822 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9eb96c13-0790-449f-ab6a-08c5094e37b5-utilities\") pod \"certified-operators-z942q\" (UID: \"9eb96c13-0790-449f-ab6a-08c5094e37b5\") " pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:05 crc kubenswrapper[4717]: I0221 22:36:05.068629 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9eb96c13-0790-449f-ab6a-08c5094e37b5-catalog-content\") pod \"certified-operators-z942q\" (UID: \"9eb96c13-0790-449f-ab6a-08c5094e37b5\") " pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:05 crc kubenswrapper[4717]: I0221 22:36:05.068816 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9eb96c13-0790-449f-ab6a-08c5094e37b5-utilities\") pod \"certified-operators-z942q\" (UID: \"9eb96c13-0790-449f-ab6a-08c5094e37b5\") " pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:05 crc kubenswrapper[4717]: I0221 22:36:05.090850 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjlcl\" (UniqueName: \"kubernetes.io/projected/9eb96c13-0790-449f-ab6a-08c5094e37b5-kube-api-access-hjlcl\") pod \"certified-operators-z942q\" (UID: \"9eb96c13-0790-449f-ab6a-08c5094e37b5\") " pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:05 crc kubenswrapper[4717]: I0221 22:36:05.226950 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:05 crc kubenswrapper[4717]: I0221 22:36:05.698744 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z942q"]
Feb 21 22:36:06 crc kubenswrapper[4717]: I0221 22:36:06.456803 4717 generic.go:334] "Generic (PLEG): container finished" podID="9eb96c13-0790-449f-ab6a-08c5094e37b5" containerID="89c51d226d527e2ca70ea65725ffa842e02d52d7f99308fad58a1123b4097c8d" exitCode=0
Feb 21 22:36:06 crc kubenswrapper[4717]: I0221 22:36:06.456884 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z942q" event={"ID":"9eb96c13-0790-449f-ab6a-08c5094e37b5","Type":"ContainerDied","Data":"89c51d226d527e2ca70ea65725ffa842e02d52d7f99308fad58a1123b4097c8d"}
Feb 21 22:36:06 crc kubenswrapper[4717]: I0221 22:36:06.457244 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z942q" event={"ID":"9eb96c13-0790-449f-ab6a-08c5094e37b5","Type":"ContainerStarted","Data":"57018e3acf42b128d506ba618580e7877a5194aa31330b0a384b79bfe8d07a1b"}
Feb 21 22:36:08 crc kubenswrapper[4717]: I0221 22:36:08.489256 4717 generic.go:334] "Generic (PLEG): container finished" podID="9eb96c13-0790-449f-ab6a-08c5094e37b5" containerID="01dee68d36475603de266ba2ad5fbf774bd666f6d86ad6256fca8fa4b0c505af" exitCode=0
Feb 21 22:36:08 crc kubenswrapper[4717]: I0221 22:36:08.489314 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z942q" event={"ID":"9eb96c13-0790-449f-ab6a-08c5094e37b5","Type":"ContainerDied","Data":"01dee68d36475603de266ba2ad5fbf774bd666f6d86ad6256fca8fa4b0c505af"}
Feb 21 22:36:09 crc kubenswrapper[4717]: I0221 22:36:09.503016 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z942q" event={"ID":"9eb96c13-0790-449f-ab6a-08c5094e37b5","Type":"ContainerStarted","Data":"57cd8d7a62f9a2cf701842747760a4b91eb6fc2cf59ce36e06055771ded6bc22"}
Feb 21 22:36:09 crc kubenswrapper[4717]: I0221 22:36:09.532686 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z942q" podStartSLOduration=3.107117729 podStartE2EDuration="5.532661216s" podCreationTimestamp="2026-02-21 22:36:04 +0000 UTC" firstStartedPulling="2026-02-21 22:36:06.45836107 +0000 UTC m=+2981.239894702" lastFinishedPulling="2026-02-21 22:36:08.883904547 +0000 UTC m=+2983.665438189" observedRunningTime="2026-02-21 22:36:09.528813354 +0000 UTC m=+2984.310347016" watchObservedRunningTime="2026-02-21 22:36:09.532661216 +0000 UTC m=+2984.314194858"
Feb 21 22:36:15 crc kubenswrapper[4717]: I0221 22:36:15.227274 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:15 crc kubenswrapper[4717]: I0221 22:36:15.227817 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:15 crc kubenswrapper[4717]: I0221 22:36:15.313659 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:15 crc kubenswrapper[4717]: I0221 22:36:15.661079 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:15 crc kubenswrapper[4717]: I0221 22:36:15.726526 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z942q"]
Feb 21 22:36:15 crc kubenswrapper[4717]: I0221 22:36:15.977770 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674"
Feb 21 22:36:15 crc kubenswrapper[4717]: E0221 22:36:15.978292 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:36:17 crc kubenswrapper[4717]: I0221 22:36:17.604294 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z942q" podUID="9eb96c13-0790-449f-ab6a-08c5094e37b5" containerName="registry-server" containerID="cri-o://57cd8d7a62f9a2cf701842747760a4b91eb6fc2cf59ce36e06055771ded6bc22" gracePeriod=2
Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.189125 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z942q"
Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.239337 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9eb96c13-0790-449f-ab6a-08c5094e37b5-utilities\") pod \"9eb96c13-0790-449f-ab6a-08c5094e37b5\" (UID: \"9eb96c13-0790-449f-ab6a-08c5094e37b5\") "
Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.239407 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjlcl\" (UniqueName: \"kubernetes.io/projected/9eb96c13-0790-449f-ab6a-08c5094e37b5-kube-api-access-hjlcl\") pod \"9eb96c13-0790-449f-ab6a-08c5094e37b5\" (UID: \"9eb96c13-0790-449f-ab6a-08c5094e37b5\") "
Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.239441 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9eb96c13-0790-449f-ab6a-08c5094e37b5-catalog-content\") pod \"9eb96c13-0790-449f-ab6a-08c5094e37b5\" (UID: \"9eb96c13-0790-449f-ab6a-08c5094e37b5\") "
Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.240570 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9eb96c13-0790-449f-ab6a-08c5094e37b5-utilities" (OuterVolumeSpecName: "utilities") pod "9eb96c13-0790-449f-ab6a-08c5094e37b5" (UID: "9eb96c13-0790-449f-ab6a-08c5094e37b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.244621 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eb96c13-0790-449f-ab6a-08c5094e37b5-kube-api-access-hjlcl" (OuterVolumeSpecName: "kube-api-access-hjlcl") pod "9eb96c13-0790-449f-ab6a-08c5094e37b5" (UID: "9eb96c13-0790-449f-ab6a-08c5094e37b5"). InnerVolumeSpecName "kube-api-access-hjlcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.341951 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9eb96c13-0790-449f-ab6a-08c5094e37b5-utilities\") on node \"crc\" DevicePath \"\""
Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.342272 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjlcl\" (UniqueName: \"kubernetes.io/projected/9eb96c13-0790-449f-ab6a-08c5094e37b5-kube-api-access-hjlcl\") on node \"crc\" DevicePath \"\""
Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.537771 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9eb96c13-0790-449f-ab6a-08c5094e37b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9eb96c13-0790-449f-ab6a-08c5094e37b5" (UID: "9eb96c13-0790-449f-ab6a-08c5094e37b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.546897 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9eb96c13-0790-449f-ab6a-08c5094e37b5-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.617107 4717 generic.go:334] "Generic (PLEG): container finished" podID="9eb96c13-0790-449f-ab6a-08c5094e37b5" containerID="57cd8d7a62f9a2cf701842747760a4b91eb6fc2cf59ce36e06055771ded6bc22" exitCode=0
Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.617159 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z942q" event={"ID":"9eb96c13-0790-449f-ab6a-08c5094e37b5","Type":"ContainerDied","Data":"57cd8d7a62f9a2cf701842747760a4b91eb6fc2cf59ce36e06055771ded6bc22"}
Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.617225 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z942q" event={"ID":"9eb96c13-0790-449f-ab6a-08c5094e37b5","Type":"ContainerDied","Data":"57018e3acf42b128d506ba618580e7877a5194aa31330b0a384b79bfe8d07a1b"}
Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.617255 4717 scope.go:117] "RemoveContainer" containerID="57cd8d7a62f9a2cf701842747760a4b91eb6fc2cf59ce36e06055771ded6bc22"
Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.618026 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z942q" Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.638182 4717 scope.go:117] "RemoveContainer" containerID="01dee68d36475603de266ba2ad5fbf774bd666f6d86ad6256fca8fa4b0c505af" Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.661371 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z942q"] Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.683713 4717 scope.go:117] "RemoveContainer" containerID="89c51d226d527e2ca70ea65725ffa842e02d52d7f99308fad58a1123b4097c8d" Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.693345 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z942q"] Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.715545 4717 scope.go:117] "RemoveContainer" containerID="57cd8d7a62f9a2cf701842747760a4b91eb6fc2cf59ce36e06055771ded6bc22" Feb 21 22:36:18 crc kubenswrapper[4717]: E0221 22:36:18.716052 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57cd8d7a62f9a2cf701842747760a4b91eb6fc2cf59ce36e06055771ded6bc22\": container with ID starting with 57cd8d7a62f9a2cf701842747760a4b91eb6fc2cf59ce36e06055771ded6bc22 not found: ID does not exist" containerID="57cd8d7a62f9a2cf701842747760a4b91eb6fc2cf59ce36e06055771ded6bc22" Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.716105 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57cd8d7a62f9a2cf701842747760a4b91eb6fc2cf59ce36e06055771ded6bc22"} err="failed to get container status \"57cd8d7a62f9a2cf701842747760a4b91eb6fc2cf59ce36e06055771ded6bc22\": rpc error: code = NotFound desc = could not find container \"57cd8d7a62f9a2cf701842747760a4b91eb6fc2cf59ce36e06055771ded6bc22\": container with ID starting with 57cd8d7a62f9a2cf701842747760a4b91eb6fc2cf59ce36e06055771ded6bc22 not 
found: ID does not exist" Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.716134 4717 scope.go:117] "RemoveContainer" containerID="01dee68d36475603de266ba2ad5fbf774bd666f6d86ad6256fca8fa4b0c505af" Feb 21 22:36:18 crc kubenswrapper[4717]: E0221 22:36:18.716559 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01dee68d36475603de266ba2ad5fbf774bd666f6d86ad6256fca8fa4b0c505af\": container with ID starting with 01dee68d36475603de266ba2ad5fbf774bd666f6d86ad6256fca8fa4b0c505af not found: ID does not exist" containerID="01dee68d36475603de266ba2ad5fbf774bd666f6d86ad6256fca8fa4b0c505af" Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.716601 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01dee68d36475603de266ba2ad5fbf774bd666f6d86ad6256fca8fa4b0c505af"} err="failed to get container status \"01dee68d36475603de266ba2ad5fbf774bd666f6d86ad6256fca8fa4b0c505af\": rpc error: code = NotFound desc = could not find container \"01dee68d36475603de266ba2ad5fbf774bd666f6d86ad6256fca8fa4b0c505af\": container with ID starting with 01dee68d36475603de266ba2ad5fbf774bd666f6d86ad6256fca8fa4b0c505af not found: ID does not exist" Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.716633 4717 scope.go:117] "RemoveContainer" containerID="89c51d226d527e2ca70ea65725ffa842e02d52d7f99308fad58a1123b4097c8d" Feb 21 22:36:18 crc kubenswrapper[4717]: E0221 22:36:18.716966 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c51d226d527e2ca70ea65725ffa842e02d52d7f99308fad58a1123b4097c8d\": container with ID starting with 89c51d226d527e2ca70ea65725ffa842e02d52d7f99308fad58a1123b4097c8d not found: ID does not exist" containerID="89c51d226d527e2ca70ea65725ffa842e02d52d7f99308fad58a1123b4097c8d" Feb 21 22:36:18 crc kubenswrapper[4717]: I0221 22:36:18.717024 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c51d226d527e2ca70ea65725ffa842e02d52d7f99308fad58a1123b4097c8d"} err="failed to get container status \"89c51d226d527e2ca70ea65725ffa842e02d52d7f99308fad58a1123b4097c8d\": rpc error: code = NotFound desc = could not find container \"89c51d226d527e2ca70ea65725ffa842e02d52d7f99308fad58a1123b4097c8d\": container with ID starting with 89c51d226d527e2ca70ea65725ffa842e02d52d7f99308fad58a1123b4097c8d not found: ID does not exist" Feb 21 22:36:19 crc kubenswrapper[4717]: I0221 22:36:19.987403 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eb96c13-0790-449f-ab6a-08c5094e37b5" path="/var/lib/kubelet/pods/9eb96c13-0790-449f-ab6a-08c5094e37b5/volumes" Feb 21 22:36:30 crc kubenswrapper[4717]: I0221 22:36:30.976605 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:36:30 crc kubenswrapper[4717]: E0221 22:36:30.977777 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:36:45 crc kubenswrapper[4717]: I0221 22:36:45.987878 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:36:45 crc kubenswrapper[4717]: E0221 22:36:45.988577 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:36:56 crc kubenswrapper[4717]: I0221 22:36:56.978110 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:36:56 crc kubenswrapper[4717]: E0221 22:36:56.979050 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:37:08 crc kubenswrapper[4717]: I0221 22:37:08.977454 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:37:08 crc kubenswrapper[4717]: E0221 22:37:08.978299 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:37:22 crc kubenswrapper[4717]: I0221 22:37:22.977911 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:37:22 crc kubenswrapper[4717]: E0221 22:37:22.978715 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:37:35 crc kubenswrapper[4717]: I0221 22:37:35.990405 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:37:35 crc kubenswrapper[4717]: E0221 22:37:35.991143 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:37:47 crc kubenswrapper[4717]: I0221 22:37:47.976478 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:37:47 crc kubenswrapper[4717]: E0221 22:37:47.977467 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:38:00 crc kubenswrapper[4717]: I0221 22:38:00.977638 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:38:00 crc kubenswrapper[4717]: E0221 22:38:00.979085 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:38:13 crc kubenswrapper[4717]: I0221 22:38:13.977251 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:38:13 crc kubenswrapper[4717]: E0221 22:38:13.978410 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:38:25 crc kubenswrapper[4717]: I0221 22:38:25.985199 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:38:25 crc kubenswrapper[4717]: E0221 22:38:25.986441 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:38:39 crc kubenswrapper[4717]: I0221 22:38:39.976481 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:38:39 crc kubenswrapper[4717]: E0221 22:38:39.977168 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:38:50 crc kubenswrapper[4717]: I0221 22:38:50.977361 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:38:50 crc kubenswrapper[4717]: E0221 22:38:50.978628 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:39:04 crc kubenswrapper[4717]: I0221 22:39:04.976468 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:39:04 crc kubenswrapper[4717]: E0221 22:39:04.977488 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:39:17 crc kubenswrapper[4717]: I0221 22:39:17.976714 4717 scope.go:117] "RemoveContainer" containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:39:18 crc kubenswrapper[4717]: I0221 22:39:18.517788 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" 
event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"ccc78a7a4117f5c2e21732ec2593c73a83c28b2a3108d6d9f273300c9f08f15e"} Feb 21 22:40:17 crc kubenswrapper[4717]: I0221 22:40:17.135360 4717 generic.go:334] "Generic (PLEG): container finished" podID="4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35" containerID="60a6b28a7b2140fa174fbdae0254a5e430559ecadfe34bc38cee37fd5986dcdb" exitCode=0 Feb 21 22:40:17 crc kubenswrapper[4717]: I0221 22:40:17.135467 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35","Type":"ContainerDied","Data":"60a6b28a7b2140fa174fbdae0254a5e430559ecadfe34bc38cee37fd5986dcdb"} Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.633941 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.730167 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-openstack-config\") pod \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.730272 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-ca-certs\") pod \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.730454 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-test-operator-ephemeral-temporary\") pod \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " Feb 21 
22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.730587 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.730827 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-ssh-key\") pod \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.731083 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-config-data\") pod \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.731791 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35" (UID: "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.732379 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-config-data" (OuterVolumeSpecName: "config-data") pod "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35" (UID: "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.732756 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-test-operator-ephemeral-workdir\") pod \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.732856 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ltm8\" (UniqueName: \"kubernetes.io/projected/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-kube-api-access-6ltm8\") pod \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.732933 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-openstack-config-secret\") pod \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\" (UID: \"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35\") " Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.735837 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-config-data\") on node \"crc\" DevicePath \"\"" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.735945 4717 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.738026 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-test-operator-ephemeral-workdir" 
(OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35" (UID: "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.741944 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-kube-api-access-6ltm8" (OuterVolumeSpecName: "kube-api-access-6ltm8") pod "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35" (UID: "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35"). InnerVolumeSpecName "kube-api-access-6ltm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.744339 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35" (UID: "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.768973 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35" (UID: "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.771993 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35" (UID: "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.779227 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35" (UID: "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.810148 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35" (UID: "4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.838060 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.838202 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.838282 4717 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.838382 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ltm8\" (UniqueName: \"kubernetes.io/projected/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-kube-api-access-6ltm8\") on node \"crc\" 
DevicePath \"\"" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.838468 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.838542 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.838627 4717 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.858973 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 21 22:40:18 crc kubenswrapper[4717]: I0221 22:40:18.940909 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 21 22:40:19 crc kubenswrapper[4717]: I0221 22:40:19.165550 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35","Type":"ContainerDied","Data":"afd8bbf3f0d5e3b0a2a56a30fc133c9e2060a62380c7b8a8441b9792417cb82a"} Feb 21 22:40:19 crc kubenswrapper[4717]: I0221 22:40:19.165633 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afd8bbf3f0d5e3b0a2a56a30fc133c9e2060a62380c7b8a8441b9792417cb82a" Feb 21 22:40:19 crc kubenswrapper[4717]: I0221 22:40:19.165686 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.713151 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 21 22:40:26 crc kubenswrapper[4717]: E0221 22:40:26.714321 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb96c13-0790-449f-ab6a-08c5094e37b5" containerName="extract-content"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.714346 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb96c13-0790-449f-ab6a-08c5094e37b5" containerName="extract-content"
Feb 21 22:40:26 crc kubenswrapper[4717]: E0221 22:40:26.714371 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb96c13-0790-449f-ab6a-08c5094e37b5" containerName="registry-server"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.714385 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb96c13-0790-449f-ab6a-08c5094e37b5" containerName="registry-server"
Feb 21 22:40:26 crc kubenswrapper[4717]: E0221 22:40:26.714443 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb96c13-0790-449f-ab6a-08c5094e37b5" containerName="extract-utilities"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.714457 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb96c13-0790-449f-ab6a-08c5094e37b5" containerName="extract-utilities"
Feb 21 22:40:26 crc kubenswrapper[4717]: E0221 22:40:26.714473 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35" containerName="tempest-tests-tempest-tests-runner"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.714486 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35" containerName="tempest-tests-tempest-tests-runner"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.714806 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eb96c13-0790-449f-ab6a-08c5094e37b5" containerName="registry-server"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.714849 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35" containerName="tempest-tests-tempest-tests-runner"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.715832 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.718549 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nxzb2"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.733001 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.811952 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d2cj\" (UniqueName: \"kubernetes.io/projected/943cf36f-ab88-4c12-a0c6-455facbc74ad-kube-api-access-9d2cj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"943cf36f-ab88-4c12-a0c6-455facbc74ad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.812082 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"943cf36f-ab88-4c12-a0c6-455facbc74ad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.913755 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d2cj\" (UniqueName: \"kubernetes.io/projected/943cf36f-ab88-4c12-a0c6-455facbc74ad-kube-api-access-9d2cj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"943cf36f-ab88-4c12-a0c6-455facbc74ad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.914343 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"943cf36f-ab88-4c12-a0c6-455facbc74ad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.914789 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"943cf36f-ab88-4c12-a0c6-455facbc74ad\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.939548 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d2cj\" (UniqueName: \"kubernetes.io/projected/943cf36f-ab88-4c12-a0c6-455facbc74ad-kube-api-access-9d2cj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"943cf36f-ab88-4c12-a0c6-455facbc74ad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 21 22:40:26 crc kubenswrapper[4717]: I0221 22:40:26.955295 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"943cf36f-ab88-4c12-a0c6-455facbc74ad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 21 22:40:27 crc kubenswrapper[4717]: I0221 22:40:27.046325 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 21 22:40:27 crc kubenswrapper[4717]: I0221 22:40:27.496988 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 21 22:40:27 crc kubenswrapper[4717]: I0221 22:40:27.511460 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 21 22:40:28 crc kubenswrapper[4717]: I0221 22:40:28.273569 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"943cf36f-ab88-4c12-a0c6-455facbc74ad","Type":"ContainerStarted","Data":"f838d9cbfff2db355b98605cf21ea3ed0fbb2e1b764dc2bfbb01c2c6a0c43a62"}
Feb 21 22:40:29 crc kubenswrapper[4717]: I0221 22:40:29.286828 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"943cf36f-ab88-4c12-a0c6-455facbc74ad","Type":"ContainerStarted","Data":"9c821407e0372e2c9f96478b371d240f6a65f82ae8539f5f8fde45b38f6e28f1"}
Feb 21 22:40:29 crc kubenswrapper[4717]: I0221 22:40:29.306846 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.391682007 podStartE2EDuration="3.306821101s" podCreationTimestamp="2026-02-21 22:40:26 +0000 UTC" firstStartedPulling="2026-02-21 22:40:27.511051319 +0000 UTC m=+3242.292584951" lastFinishedPulling="2026-02-21 22:40:28.426190383 +0000 UTC m=+3243.207724045" observedRunningTime="2026-02-21 22:40:29.302715703 +0000 UTC m=+3244.084249365" watchObservedRunningTime="2026-02-21 22:40:29.306821101 +0000 UTC m=+3244.088354763"
Feb 21 22:40:50 crc kubenswrapper[4717]: I0221 22:40:50.236229 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f2s4m/must-gather-956j5"]
Feb 21 22:40:50 crc kubenswrapper[4717]: I0221 22:40:50.238476 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f2s4m/must-gather-956j5"
Feb 21 22:40:50 crc kubenswrapper[4717]: I0221 22:40:50.249960 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-f2s4m"/"default-dockercfg-jfpjc"
Feb 21 22:40:50 crc kubenswrapper[4717]: I0221 22:40:50.251140 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-f2s4m"/"openshift-service-ca.crt"
Feb 21 22:40:50 crc kubenswrapper[4717]: I0221 22:40:50.251478 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-f2s4m"/"kube-root-ca.crt"
Feb 21 22:40:50 crc kubenswrapper[4717]: I0221 22:40:50.288234 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f2s4m/must-gather-956j5"]
Feb 21 22:40:50 crc kubenswrapper[4717]: I0221 22:40:50.338703 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d1878cb1-51e8-4916-9bff-b056af0bb210-must-gather-output\") pod \"must-gather-956j5\" (UID: \"d1878cb1-51e8-4916-9bff-b056af0bb210\") " pod="openshift-must-gather-f2s4m/must-gather-956j5"
Feb 21 22:40:50 crc kubenswrapper[4717]: I0221 22:40:50.338907 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjblh\" (UniqueName: \"kubernetes.io/projected/d1878cb1-51e8-4916-9bff-b056af0bb210-kube-api-access-gjblh\") pod \"must-gather-956j5\" (UID: \"d1878cb1-51e8-4916-9bff-b056af0bb210\") " pod="openshift-must-gather-f2s4m/must-gather-956j5"
Feb 21 22:40:50 crc kubenswrapper[4717]: I0221 22:40:50.440727 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjblh\" (UniqueName: \"kubernetes.io/projected/d1878cb1-51e8-4916-9bff-b056af0bb210-kube-api-access-gjblh\") pod \"must-gather-956j5\" (UID: \"d1878cb1-51e8-4916-9bff-b056af0bb210\") " pod="openshift-must-gather-f2s4m/must-gather-956j5"
Feb 21 22:40:50 crc kubenswrapper[4717]: I0221 22:40:50.440826 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d1878cb1-51e8-4916-9bff-b056af0bb210-must-gather-output\") pod \"must-gather-956j5\" (UID: \"d1878cb1-51e8-4916-9bff-b056af0bb210\") " pod="openshift-must-gather-f2s4m/must-gather-956j5"
Feb 21 22:40:50 crc kubenswrapper[4717]: I0221 22:40:50.441320 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d1878cb1-51e8-4916-9bff-b056af0bb210-must-gather-output\") pod \"must-gather-956j5\" (UID: \"d1878cb1-51e8-4916-9bff-b056af0bb210\") " pod="openshift-must-gather-f2s4m/must-gather-956j5"
Feb 21 22:40:50 crc kubenswrapper[4717]: I0221 22:40:50.466576 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjblh\" (UniqueName: \"kubernetes.io/projected/d1878cb1-51e8-4916-9bff-b056af0bb210-kube-api-access-gjblh\") pod \"must-gather-956j5\" (UID: \"d1878cb1-51e8-4916-9bff-b056af0bb210\") " pod="openshift-must-gather-f2s4m/must-gather-956j5"
Feb 21 22:40:50 crc kubenswrapper[4717]: I0221 22:40:50.558215 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f2s4m/must-gather-956j5"
Feb 21 22:40:50 crc kubenswrapper[4717]: I0221 22:40:50.785523 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f2s4m/must-gather-956j5"]
Feb 21 22:40:51 crc kubenswrapper[4717]: I0221 22:40:51.537222 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f2s4m/must-gather-956j5" event={"ID":"d1878cb1-51e8-4916-9bff-b056af0bb210","Type":"ContainerStarted","Data":"1eebd89aa67dbb726a0a355621614b41d302e7fb2d5fef2dc4b302414fe4b7bc"}
Feb 21 22:40:57 crc kubenswrapper[4717]: I0221 22:40:57.600470 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f2s4m/must-gather-956j5" event={"ID":"d1878cb1-51e8-4916-9bff-b056af0bb210","Type":"ContainerStarted","Data":"7390d0790f8fdf2ed58e43dac81e694d522ffb29ec34dc2268ba944818187a6d"}
Feb 21 22:40:58 crc kubenswrapper[4717]: I0221 22:40:58.615560 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f2s4m/must-gather-956j5" event={"ID":"d1878cb1-51e8-4916-9bff-b056af0bb210","Type":"ContainerStarted","Data":"a5269b69fc2dadf2644def89e7db82076eadcd226172bb4d4f925e79b440a83f"}
Feb 21 22:40:58 crc kubenswrapper[4717]: I0221 22:40:58.651467 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f2s4m/must-gather-956j5" podStartSLOduration=2.477055517 podStartE2EDuration="8.651439516s" podCreationTimestamp="2026-02-21 22:40:50 +0000 UTC" firstStartedPulling="2026-02-21 22:40:50.78987819 +0000 UTC m=+3265.571411822" lastFinishedPulling="2026-02-21 22:40:56.964262189 +0000 UTC m=+3271.745795821" observedRunningTime="2026-02-21 22:40:58.63989672 +0000 UTC m=+3273.421430382" watchObservedRunningTime="2026-02-21 22:40:58.651439516 +0000 UTC m=+3273.432973178"
Feb 21 22:41:01 crc kubenswrapper[4717]: I0221 22:41:01.048934 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f2s4m/crc-debug-jcz2x"]
Feb 21 22:41:01 crc kubenswrapper[4717]: I0221 22:41:01.053423 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f2s4m/crc-debug-jcz2x"
Feb 21 22:41:01 crc kubenswrapper[4717]: I0221 22:41:01.150888 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqdkp\" (UniqueName: \"kubernetes.io/projected/32141a39-e760-4107-b779-7cc07ac434a1-kube-api-access-cqdkp\") pod \"crc-debug-jcz2x\" (UID: \"32141a39-e760-4107-b779-7cc07ac434a1\") " pod="openshift-must-gather-f2s4m/crc-debug-jcz2x"
Feb 21 22:41:01 crc kubenswrapper[4717]: I0221 22:41:01.151033 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32141a39-e760-4107-b779-7cc07ac434a1-host\") pod \"crc-debug-jcz2x\" (UID: \"32141a39-e760-4107-b779-7cc07ac434a1\") " pod="openshift-must-gather-f2s4m/crc-debug-jcz2x"
Feb 21 22:41:01 crc kubenswrapper[4717]: I0221 22:41:01.252511 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqdkp\" (UniqueName: \"kubernetes.io/projected/32141a39-e760-4107-b779-7cc07ac434a1-kube-api-access-cqdkp\") pod \"crc-debug-jcz2x\" (UID: \"32141a39-e760-4107-b779-7cc07ac434a1\") " pod="openshift-must-gather-f2s4m/crc-debug-jcz2x"
Feb 21 22:41:01 crc kubenswrapper[4717]: I0221 22:41:01.252701 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32141a39-e760-4107-b779-7cc07ac434a1-host\") pod \"crc-debug-jcz2x\" (UID: \"32141a39-e760-4107-b779-7cc07ac434a1\") " pod="openshift-must-gather-f2s4m/crc-debug-jcz2x"
Feb 21 22:41:01 crc kubenswrapper[4717]: I0221 22:41:01.252802 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32141a39-e760-4107-b779-7cc07ac434a1-host\") pod \"crc-debug-jcz2x\" (UID: \"32141a39-e760-4107-b779-7cc07ac434a1\") " pod="openshift-must-gather-f2s4m/crc-debug-jcz2x"
Feb 21 22:41:01 crc kubenswrapper[4717]: I0221 22:41:01.283954 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqdkp\" (UniqueName: \"kubernetes.io/projected/32141a39-e760-4107-b779-7cc07ac434a1-kube-api-access-cqdkp\") pod \"crc-debug-jcz2x\" (UID: \"32141a39-e760-4107-b779-7cc07ac434a1\") " pod="openshift-must-gather-f2s4m/crc-debug-jcz2x"
Feb 21 22:41:01 crc kubenswrapper[4717]: I0221 22:41:01.374353 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f2s4m/crc-debug-jcz2x"
Feb 21 22:41:01 crc kubenswrapper[4717]: W0221 22:41:01.414840 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32141a39_e760_4107_b779_7cc07ac434a1.slice/crio-5203b7545e1dd9a8f83860a5e5d25af4f55fea526e25e34e1bb9cc0edc5b0f92 WatchSource:0}: Error finding container 5203b7545e1dd9a8f83860a5e5d25af4f55fea526e25e34e1bb9cc0edc5b0f92: Status 404 returned error can't find the container with id 5203b7545e1dd9a8f83860a5e5d25af4f55fea526e25e34e1bb9cc0edc5b0f92
Feb 21 22:41:01 crc kubenswrapper[4717]: I0221 22:41:01.646375 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f2s4m/crc-debug-jcz2x" event={"ID":"32141a39-e760-4107-b779-7cc07ac434a1","Type":"ContainerStarted","Data":"5203b7545e1dd9a8f83860a5e5d25af4f55fea526e25e34e1bb9cc0edc5b0f92"}
Feb 21 22:41:13 crc kubenswrapper[4717]: I0221 22:41:13.745826 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f2s4m/crc-debug-jcz2x" event={"ID":"32141a39-e760-4107-b779-7cc07ac434a1","Type":"ContainerStarted","Data":"dd7d1ed4db57890e95842b252381f954dc75e2cc3690ba7af3e48a9d775c744d"}
Feb 21 22:41:13 crc kubenswrapper[4717]: I0221 22:41:13.761257 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f2s4m/crc-debug-jcz2x" podStartSLOduration=1.502470059 podStartE2EDuration="12.761237826s" podCreationTimestamp="2026-02-21 22:41:01 +0000 UTC" firstStartedPulling="2026-02-21 22:41:01.421379231 +0000 UTC m=+3276.202912893" lastFinishedPulling="2026-02-21 22:41:12.680147038 +0000 UTC m=+3287.461680660" observedRunningTime="2026-02-21 22:41:13.757391834 +0000 UTC m=+3288.538925456" watchObservedRunningTime="2026-02-21 22:41:13.761237826 +0000 UTC m=+3288.542771458"
Feb 21 22:41:39 crc kubenswrapper[4717]: I0221 22:41:39.063045 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 22:41:39 crc kubenswrapper[4717]: I0221 22:41:39.063531 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 21 22:41:50 crc kubenswrapper[4717]: I0221 22:41:50.105022 4717 generic.go:334] "Generic (PLEG): container finished" podID="32141a39-e760-4107-b779-7cc07ac434a1" containerID="dd7d1ed4db57890e95842b252381f954dc75e2cc3690ba7af3e48a9d775c744d" exitCode=0
Feb 21 22:41:50 crc kubenswrapper[4717]: I0221 22:41:50.105111 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f2s4m/crc-debug-jcz2x" event={"ID":"32141a39-e760-4107-b779-7cc07ac434a1","Type":"ContainerDied","Data":"dd7d1ed4db57890e95842b252381f954dc75e2cc3690ba7af3e48a9d775c744d"}
Feb 21 22:41:51 crc kubenswrapper[4717]: I0221 22:41:51.218433 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f2s4m/crc-debug-jcz2x"
Feb 21 22:41:51 crc kubenswrapper[4717]: I0221 22:41:51.268526 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f2s4m/crc-debug-jcz2x"]
Feb 21 22:41:51 crc kubenswrapper[4717]: I0221 22:41:51.281382 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f2s4m/crc-debug-jcz2x"]
Feb 21 22:41:51 crc kubenswrapper[4717]: I0221 22:41:51.329121 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqdkp\" (UniqueName: \"kubernetes.io/projected/32141a39-e760-4107-b779-7cc07ac434a1-kube-api-access-cqdkp\") pod \"32141a39-e760-4107-b779-7cc07ac434a1\" (UID: \"32141a39-e760-4107-b779-7cc07ac434a1\") "
Feb 21 22:41:51 crc kubenswrapper[4717]: I0221 22:41:51.329171 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32141a39-e760-4107-b779-7cc07ac434a1-host\") pod \"32141a39-e760-4107-b779-7cc07ac434a1\" (UID: \"32141a39-e760-4107-b779-7cc07ac434a1\") "
Feb 21 22:41:51 crc kubenswrapper[4717]: I0221 22:41:51.329320 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32141a39-e760-4107-b779-7cc07ac434a1-host" (OuterVolumeSpecName: "host") pod "32141a39-e760-4107-b779-7cc07ac434a1" (UID: "32141a39-e760-4107-b779-7cc07ac434a1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 22:41:51 crc kubenswrapper[4717]: I0221 22:41:51.329736 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32141a39-e760-4107-b779-7cc07ac434a1-host\") on node \"crc\" DevicePath \"\""
Feb 21 22:41:51 crc kubenswrapper[4717]: I0221 22:41:51.334781 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32141a39-e760-4107-b779-7cc07ac434a1-kube-api-access-cqdkp" (OuterVolumeSpecName: "kube-api-access-cqdkp") pod "32141a39-e760-4107-b779-7cc07ac434a1" (UID: "32141a39-e760-4107-b779-7cc07ac434a1"). InnerVolumeSpecName "kube-api-access-cqdkp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:41:51 crc kubenswrapper[4717]: I0221 22:41:51.431687 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqdkp\" (UniqueName: \"kubernetes.io/projected/32141a39-e760-4107-b779-7cc07ac434a1-kube-api-access-cqdkp\") on node \"crc\" DevicePath \"\""
Feb 21 22:41:51 crc kubenswrapper[4717]: I0221 22:41:51.989969 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32141a39-e760-4107-b779-7cc07ac434a1" path="/var/lib/kubelet/pods/32141a39-e760-4107-b779-7cc07ac434a1/volumes"
Feb 21 22:41:52 crc kubenswrapper[4717]: I0221 22:41:52.127258 4717 scope.go:117] "RemoveContainer" containerID="dd7d1ed4db57890e95842b252381f954dc75e2cc3690ba7af3e48a9d775c744d"
Feb 21 22:41:52 crc kubenswrapper[4717]: I0221 22:41:52.127315 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f2s4m/crc-debug-jcz2x"
Feb 21 22:41:52 crc kubenswrapper[4717]: I0221 22:41:52.444598 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f2s4m/crc-debug-hbw9t"]
Feb 21 22:41:52 crc kubenswrapper[4717]: E0221 22:41:52.445006 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32141a39-e760-4107-b779-7cc07ac434a1" containerName="container-00"
Feb 21 22:41:52 crc kubenswrapper[4717]: I0221 22:41:52.445021 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="32141a39-e760-4107-b779-7cc07ac434a1" containerName="container-00"
Feb 21 22:41:52 crc kubenswrapper[4717]: I0221 22:41:52.445200 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="32141a39-e760-4107-b779-7cc07ac434a1" containerName="container-00"
Feb 21 22:41:52 crc kubenswrapper[4717]: I0221 22:41:52.445754 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f2s4m/crc-debug-hbw9t"
Feb 21 22:41:52 crc kubenswrapper[4717]: I0221 22:41:52.553333 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd9kk\" (UniqueName: \"kubernetes.io/projected/a8d2afbf-20c0-468c-8e03-453e3d90b2a7-kube-api-access-zd9kk\") pod \"crc-debug-hbw9t\" (UID: \"a8d2afbf-20c0-468c-8e03-453e3d90b2a7\") " pod="openshift-must-gather-f2s4m/crc-debug-hbw9t"
Feb 21 22:41:52 crc kubenswrapper[4717]: I0221 22:41:52.553506 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8d2afbf-20c0-468c-8e03-453e3d90b2a7-host\") pod \"crc-debug-hbw9t\" (UID: \"a8d2afbf-20c0-468c-8e03-453e3d90b2a7\") " pod="openshift-must-gather-f2s4m/crc-debug-hbw9t"
Feb 21 22:41:52 crc kubenswrapper[4717]: I0221 22:41:52.655719 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd9kk\" (UniqueName: \"kubernetes.io/projected/a8d2afbf-20c0-468c-8e03-453e3d90b2a7-kube-api-access-zd9kk\") pod \"crc-debug-hbw9t\" (UID: \"a8d2afbf-20c0-468c-8e03-453e3d90b2a7\") " pod="openshift-must-gather-f2s4m/crc-debug-hbw9t"
Feb 21 22:41:52 crc kubenswrapper[4717]: I0221 22:41:52.655845 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8d2afbf-20c0-468c-8e03-453e3d90b2a7-host\") pod \"crc-debug-hbw9t\" (UID: \"a8d2afbf-20c0-468c-8e03-453e3d90b2a7\") " pod="openshift-must-gather-f2s4m/crc-debug-hbw9t"
Feb 21 22:41:52 crc kubenswrapper[4717]: I0221 22:41:52.655984 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8d2afbf-20c0-468c-8e03-453e3d90b2a7-host\") pod \"crc-debug-hbw9t\" (UID: \"a8d2afbf-20c0-468c-8e03-453e3d90b2a7\") " pod="openshift-must-gather-f2s4m/crc-debug-hbw9t"
Feb 21 22:41:52 crc kubenswrapper[4717]: I0221 22:41:52.674453 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd9kk\" (UniqueName: \"kubernetes.io/projected/a8d2afbf-20c0-468c-8e03-453e3d90b2a7-kube-api-access-zd9kk\") pod \"crc-debug-hbw9t\" (UID: \"a8d2afbf-20c0-468c-8e03-453e3d90b2a7\") " pod="openshift-must-gather-f2s4m/crc-debug-hbw9t"
Feb 21 22:41:52 crc kubenswrapper[4717]: I0221 22:41:52.767531 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f2s4m/crc-debug-hbw9t"
Feb 21 22:41:53 crc kubenswrapper[4717]: I0221 22:41:53.134941 4717 generic.go:334] "Generic (PLEG): container finished" podID="a8d2afbf-20c0-468c-8e03-453e3d90b2a7" containerID="48f67b8cd5e3a42e70ed8e3f33436cfe7ca461d8d3499ea3d0d27bad3ebaf44e" exitCode=0
Feb 21 22:41:53 crc kubenswrapper[4717]: I0221 22:41:53.135047 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f2s4m/crc-debug-hbw9t" event={"ID":"a8d2afbf-20c0-468c-8e03-453e3d90b2a7","Type":"ContainerDied","Data":"48f67b8cd5e3a42e70ed8e3f33436cfe7ca461d8d3499ea3d0d27bad3ebaf44e"}
Feb 21 22:41:53 crc kubenswrapper[4717]: I0221 22:41:53.135126 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f2s4m/crc-debug-hbw9t" event={"ID":"a8d2afbf-20c0-468c-8e03-453e3d90b2a7","Type":"ContainerStarted","Data":"0784a1a36d9d97e8556b02817958275a983f57339e48b4ee59ce105e538600bb"}
Feb 21 22:41:53 crc kubenswrapper[4717]: I0221 22:41:53.620105 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f2s4m/crc-debug-hbw9t"]
Feb 21 22:41:53 crc kubenswrapper[4717]: I0221 22:41:53.629500 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f2s4m/crc-debug-hbw9t"]
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.238553 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f2s4m/crc-debug-hbw9t"
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.385210 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8d2afbf-20c0-468c-8e03-453e3d90b2a7-host\") pod \"a8d2afbf-20c0-468c-8e03-453e3d90b2a7\" (UID: \"a8d2afbf-20c0-468c-8e03-453e3d90b2a7\") "
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.385329 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8d2afbf-20c0-468c-8e03-453e3d90b2a7-host" (OuterVolumeSpecName: "host") pod "a8d2afbf-20c0-468c-8e03-453e3d90b2a7" (UID: "a8d2afbf-20c0-468c-8e03-453e3d90b2a7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.385402 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd9kk\" (UniqueName: \"kubernetes.io/projected/a8d2afbf-20c0-468c-8e03-453e3d90b2a7-kube-api-access-zd9kk\") pod \"a8d2afbf-20c0-468c-8e03-453e3d90b2a7\" (UID: \"a8d2afbf-20c0-468c-8e03-453e3d90b2a7\") "
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.386245 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8d2afbf-20c0-468c-8e03-453e3d90b2a7-host\") on node \"crc\" DevicePath \"\""
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.406353 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8d2afbf-20c0-468c-8e03-453e3d90b2a7-kube-api-access-zd9kk" (OuterVolumeSpecName: "kube-api-access-zd9kk") pod "a8d2afbf-20c0-468c-8e03-453e3d90b2a7" (UID: "a8d2afbf-20c0-468c-8e03-453e3d90b2a7"). InnerVolumeSpecName "kube-api-access-zd9kk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.487389 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd9kk\" (UniqueName: \"kubernetes.io/projected/a8d2afbf-20c0-468c-8e03-453e3d90b2a7-kube-api-access-zd9kk\") on node \"crc\" DevicePath \"\""
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.804318 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f2s4m/crc-debug-4b2ld"]
Feb 21 22:41:54 crc kubenswrapper[4717]: E0221 22:41:54.805019 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d2afbf-20c0-468c-8e03-453e3d90b2a7" containerName="container-00"
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.805032 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d2afbf-20c0-468c-8e03-453e3d90b2a7" containerName="container-00"
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.805214 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8d2afbf-20c0-468c-8e03-453e3d90b2a7" containerName="container-00"
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.805787 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f2s4m/crc-debug-4b2ld"
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.895225 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmnjp\" (UniqueName: \"kubernetes.io/projected/0480db83-97bc-41da-81ea-891f9c65c997-kube-api-access-rmnjp\") pod \"crc-debug-4b2ld\" (UID: \"0480db83-97bc-41da-81ea-891f9c65c997\") " pod="openshift-must-gather-f2s4m/crc-debug-4b2ld"
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.895324 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0480db83-97bc-41da-81ea-891f9c65c997-host\") pod \"crc-debug-4b2ld\" (UID: \"0480db83-97bc-41da-81ea-891f9c65c997\") " pod="openshift-must-gather-f2s4m/crc-debug-4b2ld"
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.997180 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmnjp\" (UniqueName: \"kubernetes.io/projected/0480db83-97bc-41da-81ea-891f9c65c997-kube-api-access-rmnjp\") pod \"crc-debug-4b2ld\" (UID: \"0480db83-97bc-41da-81ea-891f9c65c997\") " pod="openshift-must-gather-f2s4m/crc-debug-4b2ld"
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.997262 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0480db83-97bc-41da-81ea-891f9c65c997-host\") pod \"crc-debug-4b2ld\" (UID: \"0480db83-97bc-41da-81ea-891f9c65c997\") " pod="openshift-must-gather-f2s4m/crc-debug-4b2ld"
Feb 21 22:41:54 crc kubenswrapper[4717]: I0221 22:41:54.997488 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0480db83-97bc-41da-81ea-891f9c65c997-host\") pod \"crc-debug-4b2ld\" (UID: \"0480db83-97bc-41da-81ea-891f9c65c997\") " pod="openshift-must-gather-f2s4m/crc-debug-4b2ld"
Feb 21 22:41:55 crc kubenswrapper[4717]: I0221 22:41:55.022511 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmnjp\" (UniqueName: \"kubernetes.io/projected/0480db83-97bc-41da-81ea-891f9c65c997-kube-api-access-rmnjp\") pod \"crc-debug-4b2ld\" (UID: \"0480db83-97bc-41da-81ea-891f9c65c997\") " pod="openshift-must-gather-f2s4m/crc-debug-4b2ld"
Feb 21 22:41:55 crc kubenswrapper[4717]: I0221 22:41:55.122058 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f2s4m/crc-debug-4b2ld"
Feb 21 22:41:55 crc kubenswrapper[4717]: W0221 22:41:55.149076 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0480db83_97bc_41da_81ea_891f9c65c997.slice/crio-c4d7142a013b73079df20ed90643fdab8ee7835ba0c47d77aced025f6bf352de WatchSource:0}: Error finding container c4d7142a013b73079df20ed90643fdab8ee7835ba0c47d77aced025f6bf352de: Status 404 returned error can't find the container with id c4d7142a013b73079df20ed90643fdab8ee7835ba0c47d77aced025f6bf352de
Feb 21 22:41:55 crc kubenswrapper[4717]: I0221 22:41:55.154266 4717 scope.go:117] "RemoveContainer" containerID="48f67b8cd5e3a42e70ed8e3f33436cfe7ca461d8d3499ea3d0d27bad3ebaf44e"
Feb 21 22:41:55 crc kubenswrapper[4717]: I0221 22:41:55.154348 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f2s4m/crc-debug-hbw9t"
Feb 21 22:41:56 crc kubenswrapper[4717]: I0221 22:41:56.011002 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8d2afbf-20c0-468c-8e03-453e3d90b2a7" path="/var/lib/kubelet/pods/a8d2afbf-20c0-468c-8e03-453e3d90b2a7/volumes"
Feb 21 22:41:56 crc kubenswrapper[4717]: I0221 22:41:56.166453 4717 generic.go:334] "Generic (PLEG): container finished" podID="0480db83-97bc-41da-81ea-891f9c65c997" containerID="7715f8e992ffef7bfd1baf0c3d90c77813714c758da698d8c88dfdbc9eb98b5a" exitCode=0
Feb 21 22:41:56 crc kubenswrapper[4717]: I0221 22:41:56.166504 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f2s4m/crc-debug-4b2ld" event={"ID":"0480db83-97bc-41da-81ea-891f9c65c997","Type":"ContainerDied","Data":"7715f8e992ffef7bfd1baf0c3d90c77813714c758da698d8c88dfdbc9eb98b5a"}
Feb 21 22:41:56 crc kubenswrapper[4717]: I0221 22:41:56.166553 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f2s4m/crc-debug-4b2ld" event={"ID":"0480db83-97bc-41da-81ea-891f9c65c997","Type":"ContainerStarted","Data":"c4d7142a013b73079df20ed90643fdab8ee7835ba0c47d77aced025f6bf352de"}
Feb 21 22:41:56 crc kubenswrapper[4717]: I0221 22:41:56.204788 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f2s4m/crc-debug-4b2ld"]
Feb 21 22:41:56 crc kubenswrapper[4717]: I0221 22:41:56.213407 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f2s4m/crc-debug-4b2ld"]
Feb 21 22:41:57 crc kubenswrapper[4717]: I0221 22:41:57.920601 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f2s4m/crc-debug-4b2ld"
Feb 21 22:41:58 crc kubenswrapper[4717]: I0221 22:41:58.095313 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmnjp\" (UniqueName: \"kubernetes.io/projected/0480db83-97bc-41da-81ea-891f9c65c997-kube-api-access-rmnjp\") pod \"0480db83-97bc-41da-81ea-891f9c65c997\" (UID: \"0480db83-97bc-41da-81ea-891f9c65c997\") "
Feb 21 22:41:58 crc kubenswrapper[4717]: I0221 22:41:58.095389 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0480db83-97bc-41da-81ea-891f9c65c997-host\") pod \"0480db83-97bc-41da-81ea-891f9c65c997\" (UID: \"0480db83-97bc-41da-81ea-891f9c65c997\") "
Feb 21 22:41:58 crc kubenswrapper[4717]: I0221 22:41:58.095774 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0480db83-97bc-41da-81ea-891f9c65c997-host" (OuterVolumeSpecName: "host") pod "0480db83-97bc-41da-81ea-891f9c65c997" (UID: "0480db83-97bc-41da-81ea-891f9c65c997"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 22:41:58 crc kubenswrapper[4717]: I0221 22:41:58.102387 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0480db83-97bc-41da-81ea-891f9c65c997-kube-api-access-rmnjp" (OuterVolumeSpecName: "kube-api-access-rmnjp") pod "0480db83-97bc-41da-81ea-891f9c65c997" (UID: "0480db83-97bc-41da-81ea-891f9c65c997"). InnerVolumeSpecName "kube-api-access-rmnjp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:41:58 crc kubenswrapper[4717]: I0221 22:41:58.182891 4717 scope.go:117] "RemoveContainer" containerID="7715f8e992ffef7bfd1baf0c3d90c77813714c758da698d8c88dfdbc9eb98b5a"
Feb 21 22:41:58 crc kubenswrapper[4717]: I0221 22:41:58.182931 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f2s4m/crc-debug-4b2ld"
Feb 21 22:41:58 crc kubenswrapper[4717]: I0221 22:41:58.197555 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmnjp\" (UniqueName: \"kubernetes.io/projected/0480db83-97bc-41da-81ea-891f9c65c997-kube-api-access-rmnjp\") on node \"crc\" DevicePath \"\""
Feb 21 22:41:58 crc kubenswrapper[4717]: I0221 22:41:58.197664 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0480db83-97bc-41da-81ea-891f9c65c997-host\") on node \"crc\" DevicePath \"\""
Feb 21 22:41:59 crc kubenswrapper[4717]: I0221 22:41:59.993669 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0480db83-97bc-41da-81ea-891f9c65c997" path="/var/lib/kubelet/pods/0480db83-97bc-41da-81ea-891f9c65c997/volumes"
Feb 21 22:42:09 crc kubenswrapper[4717]: I0221 22:42:09.062469 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 22:42:09 crc kubenswrapper[4717]: I0221 22:42:09.063023 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 21 22:42:15 crc kubenswrapper[4717]: I0221 22:42:15.334985 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-df9cd9b6-x9br7_ace10c13-1b1b-4e3a-8f58-b8dab0c80704/barbican-api/0.log"
Feb 21 22:42:15 crc kubenswrapper[4717]: I0221 22:42:15.483639 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-df9cd9b6-x9br7_ace10c13-1b1b-4e3a-8f58-b8dab0c80704/barbican-api-log/0.log"
Feb 21 22:42:15 crc kubenswrapper[4717]: I0221 22:42:15.505951 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6bbdbc7546-gvgtj_71cac7a0-f790-43e5-87f9-d3862c20f857/barbican-keystone-listener/0.log"
Feb 21 22:42:15 crc kubenswrapper[4717]: I0221 22:42:15.666018 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6bbdbc7546-gvgtj_71cac7a0-f790-43e5-87f9-d3862c20f857/barbican-keystone-listener-log/0.log"
Feb 21 22:42:15 crc kubenswrapper[4717]: I0221 22:42:15.684781 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6fbd665b5-2sdwf_30134acb-a272-4da8-a2b6-683e431f593e/barbican-worker/0.log"
Feb 21 22:42:15 crc kubenswrapper[4717]: I0221 22:42:15.834043 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6fbd665b5-2sdwf_30134acb-a272-4da8-a2b6-683e431f593e/barbican-worker-log/0.log"
Feb 21 22:42:15 crc kubenswrapper[4717]: I0221 22:42:15.915575 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-twq55_da6e1269-a5c6-4f39-8d0a-b544de9522ba/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 21 22:42:16 crc kubenswrapper[4717]: I0221 22:42:16.083756 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_08a3f1a6-792f-4679-84f0-55795cb63990/ceilometer-central-agent/0.log"
Feb 21 22:42:16 crc kubenswrapper[4717]: I0221 22:42:16.111704 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_08a3f1a6-792f-4679-84f0-55795cb63990/ceilometer-notification-agent/0.log"
Feb 21 22:42:16 crc kubenswrapper[4717]: I0221 22:42:16.126751 4717 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack_ceilometer-0_08a3f1a6-792f-4679-84f0-55795cb63990/proxy-httpd/0.log" Feb 21 22:42:16 crc kubenswrapper[4717]: I0221 22:42:16.289019 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_08a3f1a6-792f-4679-84f0-55795cb63990/sg-core/0.log" Feb 21 22:42:16 crc kubenswrapper[4717]: I0221 22:42:16.329083 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_db0133ac-ab76-4b9d-a3e9-e55a095f919a/cinder-api/0.log" Feb 21 22:42:16 crc kubenswrapper[4717]: I0221 22:42:16.341710 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_db0133ac-ab76-4b9d-a3e9-e55a095f919a/cinder-api-log/0.log" Feb 21 22:42:16 crc kubenswrapper[4717]: I0221 22:42:16.535305 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9955b361-63c8-42bb-9efc-7ab0b3150904/cinder-scheduler/0.log" Feb 21 22:42:16 crc kubenswrapper[4717]: I0221 22:42:16.576465 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9955b361-63c8-42bb-9efc-7ab0b3150904/probe/0.log" Feb 21 22:42:16 crc kubenswrapper[4717]: I0221 22:42:16.733029 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk_79339bc9-6d8a-4fe5-ba8d-37643afe6d98/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:42:16 crc kubenswrapper[4717]: I0221 22:42:16.770661 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6hp66_ce64ba17-432c-46d7-86f2-33cc62514604/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:42:16 crc kubenswrapper[4717]: I0221 22:42:16.948159 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-q56p6_ba43f982-ee7f-4e48-a144-0c6d5c54c5a1/init/0.log" Feb 21 22:42:17 crc kubenswrapper[4717]: I0221 22:42:17.271709 4717 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-q56p6_ba43f982-ee7f-4e48-a144-0c6d5c54c5a1/init/0.log" Feb 21 22:42:17 crc kubenswrapper[4717]: I0221 22:42:17.316848 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm_674f8569-62c3-477e-85af-13befe292f49/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:42:17 crc kubenswrapper[4717]: I0221 22:42:17.339760 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-q56p6_ba43f982-ee7f-4e48-a144-0c6d5c54c5a1/dnsmasq-dns/0.log" Feb 21 22:42:17 crc kubenswrapper[4717]: I0221 22:42:17.510957 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a8060c80-2f4b-4099-b5fd-841fadcdb329/glance-httpd/0.log" Feb 21 22:42:17 crc kubenswrapper[4717]: I0221 22:42:17.546208 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a8060c80-2f4b-4099-b5fd-841fadcdb329/glance-log/0.log" Feb 21 22:42:17 crc kubenswrapper[4717]: I0221 22:42:17.681145 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_128a5f99-d2a9-4551-8fd0-45efc6017dab/glance-httpd/0.log" Feb 21 22:42:17 crc kubenswrapper[4717]: I0221 22:42:17.750536 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_128a5f99-d2a9-4551-8fd0-45efc6017dab/glance-log/0.log" Feb 21 22:42:17 crc kubenswrapper[4717]: I0221 22:42:17.898146 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-86b8dffbf6-mztpd_4e230efb-55a4-4e7f-9d9a-cc61d3123eab/horizon/0.log" Feb 21 22:42:18 crc kubenswrapper[4717]: I0221 22:42:18.064612 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-j6gml_6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:42:18 crc kubenswrapper[4717]: I0221 22:42:18.218249 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-86b8dffbf6-mztpd_4e230efb-55a4-4e7f-9d9a-cc61d3123eab/horizon-log/0.log" Feb 21 22:42:18 crc kubenswrapper[4717]: I0221 22:42:18.313981 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-ml46s_0fcb6379-1d8c-44c9-8e50-10c52e40abcd/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:42:18 crc kubenswrapper[4717]: I0221 22:42:18.548254 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b/kube-state-metrics/0.log" Feb 21 22:42:18 crc kubenswrapper[4717]: I0221 22:42:18.557444 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-86d46bb596-pj8cr_110e5c1e-4f14-4ab1-a0e0-f54dec9095a2/keystone-api/0.log" Feb 21 22:42:18 crc kubenswrapper[4717]: I0221 22:42:18.711996 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-xpj84_4dac73d5-a471-44a7-a91a-3422e09c7bb0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:42:19 crc kubenswrapper[4717]: I0221 22:42:19.103064 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-677cdf8c9f-j2vl7_5aad4222-11d4-4bb0-9a90-b9924339c70e/neutron-api/0.log" Feb 21 22:42:19 crc kubenswrapper[4717]: I0221 22:42:19.140588 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-677cdf8c9f-j2vl7_5aad4222-11d4-4bb0-9a90-b9924339c70e/neutron-httpd/0.log" Feb 21 22:42:19 crc kubenswrapper[4717]: I0221 22:42:19.383016 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx_788c0818-654d-4001-a3ab-06c9dbd10592/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:42:19 crc kubenswrapper[4717]: I0221 22:42:19.698636 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3b045abf-1d97-45e2-a8ed-ed13aedc19f7/nova-api-log/0.log" Feb 21 22:42:19 crc kubenswrapper[4717]: I0221 22:42:19.818159 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_1030ba56-81a8-4d0d-8e3d-c17779adcac6/nova-cell0-conductor-conductor/0.log" Feb 21 22:42:19 crc kubenswrapper[4717]: I0221 22:42:19.937338 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3b045abf-1d97-45e2-a8ed-ed13aedc19f7/nova-api-api/0.log" Feb 21 22:42:20 crc kubenswrapper[4717]: I0221 22:42:20.020968 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_5e40eb10-e007-4fc0-97ff-de455effb430/nova-cell1-conductor-conductor/0.log" Feb 21 22:42:20 crc kubenswrapper[4717]: I0221 22:42:20.136328 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_7a7413e9-1833-491d-8d42-0ac926edea33/nova-cell1-novncproxy-novncproxy/0.log" Feb 21 22:42:20 crc kubenswrapper[4717]: I0221 22:42:20.265766 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-7l5ng_0fa98805-0ef5-463d-9ae3-1a66efcb9b0c/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:42:20 crc kubenswrapper[4717]: I0221 22:42:20.460435 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5e8a58dd-e035-414e-b133-160ea01477aa/nova-metadata-log/0.log" Feb 21 22:42:20 crc kubenswrapper[4717]: I0221 22:42:20.824836 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_4a73e4bf-8575-43f8-bfff-35b8ca593732/mysql-bootstrap/0.log" Feb 21 22:42:20 crc kubenswrapper[4717]: I0221 22:42:20.863081 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_14550721-34cc-41f8-a5f0-e15e73cf2983/nova-scheduler-scheduler/0.log" Feb 21 22:42:21 crc kubenswrapper[4717]: I0221 22:42:21.090401 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4a73e4bf-8575-43f8-bfff-35b8ca593732/galera/0.log" Feb 21 22:42:21 crc kubenswrapper[4717]: I0221 22:42:21.123058 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4a73e4bf-8575-43f8-bfff-35b8ca593732/mysql-bootstrap/0.log" Feb 21 22:42:21 crc kubenswrapper[4717]: I0221 22:42:21.354498 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e49285f0-879f-40db-8eb9-2e8e18a87bb7/mysql-bootstrap/0.log" Feb 21 22:42:21 crc kubenswrapper[4717]: I0221 22:42:21.451387 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5e8a58dd-e035-414e-b133-160ea01477aa/nova-metadata-metadata/0.log" Feb 21 22:42:21 crc kubenswrapper[4717]: I0221 22:42:21.457723 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e49285f0-879f-40db-8eb9-2e8e18a87bb7/mysql-bootstrap/0.log" Feb 21 22:42:21 crc kubenswrapper[4717]: I0221 22:42:21.464875 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e49285f0-879f-40db-8eb9-2e8e18a87bb7/galera/0.log" Feb 21 22:42:21 crc kubenswrapper[4717]: I0221 22:42:21.691455 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_756781b7-938f-4654-8109-725420287d7b/openstackclient/0.log" Feb 21 22:42:21 crc kubenswrapper[4717]: I0221 22:42:21.772830 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-828xd_4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1/ovn-controller/0.log" Feb 21 22:42:21 crc kubenswrapper[4717]: I0221 22:42:21.885286 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-kllk6_d770b9d0-2378-4b10-bf5a-b7e91c6b3843/openstack-network-exporter/0.log" Feb 21 22:42:21 crc kubenswrapper[4717]: I0221 22:42:21.973249 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sls6z_51572320-b28e-45be-ba55-524f9e0ccc61/ovsdb-server-init/0.log" Feb 21 22:42:22 crc kubenswrapper[4717]: I0221 22:42:22.189987 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sls6z_51572320-b28e-45be-ba55-524f9e0ccc61/ovsdb-server/0.log" Feb 21 22:42:22 crc kubenswrapper[4717]: I0221 22:42:22.284433 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sls6z_51572320-b28e-45be-ba55-524f9e0ccc61/ovs-vswitchd/0.log" Feb 21 22:42:22 crc kubenswrapper[4717]: I0221 22:42:22.307889 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sls6z_51572320-b28e-45be-ba55-524f9e0ccc61/ovsdb-server-init/0.log" Feb 21 22:42:22 crc kubenswrapper[4717]: I0221 22:42:22.425833 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-dvvlx_cb09f299-8779-421d-a58f-dd16db2daadb/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:42:22 crc kubenswrapper[4717]: I0221 22:42:22.518796 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_34cc3509-6f63-43c2-86a3-284360464284/openstack-network-exporter/0.log" Feb 21 22:42:22 crc kubenswrapper[4717]: I0221 22:42:22.573569 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_34cc3509-6f63-43c2-86a3-284360464284/ovn-northd/0.log" Feb 21 22:42:22 crc kubenswrapper[4717]: I0221 22:42:22.659129 4717 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dcfdad72-66d7-4087-b0c3-4cb1925565a1/openstack-network-exporter/0.log" Feb 21 22:42:22 crc kubenswrapper[4717]: I0221 22:42:22.710168 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dcfdad72-66d7-4087-b0c3-4cb1925565a1/ovsdbserver-nb/0.log" Feb 21 22:42:22 crc kubenswrapper[4717]: I0221 22:42:22.897510 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_75f38ef9-3fc9-428a-8364-96c3938d69e5/openstack-network-exporter/0.log" Feb 21 22:42:22 crc kubenswrapper[4717]: I0221 22:42:22.949452 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_75f38ef9-3fc9-428a-8364-96c3938d69e5/ovsdbserver-sb/0.log" Feb 21 22:42:23 crc kubenswrapper[4717]: I0221 22:42:23.079950 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f5bdf5f76-9rjst_131cec55-efe0-49f9-ad5e-cfbca687c941/placement-api/0.log" Feb 21 22:42:23 crc kubenswrapper[4717]: I0221 22:42:23.151468 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f5bdf5f76-9rjst_131cec55-efe0-49f9-ad5e-cfbca687c941/placement-log/0.log" Feb 21 22:42:23 crc kubenswrapper[4717]: I0221 22:42:23.207976 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b5547796-04e1-40e3-aa4a-a1aa936efcda/setup-container/0.log" Feb 21 22:42:23 crc kubenswrapper[4717]: I0221 22:42:23.393259 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b5547796-04e1-40e3-aa4a-a1aa936efcda/setup-container/0.log" Feb 21 22:42:23 crc kubenswrapper[4717]: I0221 22:42:23.429186 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b5547796-04e1-40e3-aa4a-a1aa936efcda/rabbitmq/0.log" Feb 21 22:42:23 crc kubenswrapper[4717]: I0221 22:42:23.553060 4717 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_38ce8ac2-d776-449b-89d8-3e9a853a8f44/setup-container/0.log" Feb 21 22:42:23 crc kubenswrapper[4717]: I0221 22:42:23.596937 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_38ce8ac2-d776-449b-89d8-3e9a853a8f44/setup-container/0.log" Feb 21 22:42:23 crc kubenswrapper[4717]: I0221 22:42:23.653355 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_38ce8ac2-d776-449b-89d8-3e9a853a8f44/rabbitmq/0.log" Feb 21 22:42:23 crc kubenswrapper[4717]: I0221 22:42:23.778952 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5_849e3b78-1693-47ad-9fc7-63c7a188d53e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:42:23 crc kubenswrapper[4717]: I0221 22:42:23.869710 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-tqqzt_49b31634-aebd-4b3a-a1a3-f7d3e06782cf/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:42:24 crc kubenswrapper[4717]: I0221 22:42:24.255761 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-pslvp_d1311e18-5b26-49a3-86e3-481b3f9e5b03/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:42:24 crc kubenswrapper[4717]: I0221 22:42:24.258617 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg_331acbed-5028-4fdb-84a4-b105805863b9/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:42:24 crc kubenswrapper[4717]: I0221 22:42:24.509525 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-gshck_dee1c8dd-766e-41de-8631-cfc7d23a7681/ssh-known-hosts-edpm-deployment/0.log" Feb 21 22:42:24 crc kubenswrapper[4717]: I0221 22:42:24.628181 4717 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-proxy-5c4dc8df6c-b88lw_b1949ec1-5153-4003-b960-68a8f126b72d/proxy-server/0.log" Feb 21 22:42:24 crc kubenswrapper[4717]: I0221 22:42:24.658872 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c4dc8df6c-b88lw_b1949ec1-5153-4003-b960-68a8f126b72d/proxy-httpd/0.log" Feb 21 22:42:24 crc kubenswrapper[4717]: I0221 22:42:24.725114 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-5zc8b_9d1b5d67-1e8c-4c1f-a6a3-9634827165f8/swift-ring-rebalance/0.log" Feb 21 22:42:24 crc kubenswrapper[4717]: I0221 22:42:24.825251 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/account-auditor/0.log" Feb 21 22:42:24 crc kubenswrapper[4717]: I0221 22:42:24.936705 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/account-reaper/0.log" Feb 21 22:42:25 crc kubenswrapper[4717]: I0221 22:42:25.014090 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/account-replicator/0.log" Feb 21 22:42:25 crc kubenswrapper[4717]: I0221 22:42:25.145741 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/container-auditor/0.log" Feb 21 22:42:25 crc kubenswrapper[4717]: I0221 22:42:25.155311 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/container-replicator/0.log" Feb 21 22:42:25 crc kubenswrapper[4717]: I0221 22:42:25.161039 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/account-server/0.log" Feb 21 22:42:25 crc kubenswrapper[4717]: I0221 22:42:25.250375 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/container-server/0.log" Feb 21 22:42:25 crc kubenswrapper[4717]: I0221 22:42:25.365825 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/object-auditor/0.log" Feb 21 22:42:25 crc kubenswrapper[4717]: I0221 22:42:25.378961 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/object-expirer/0.log" Feb 21 22:42:25 crc kubenswrapper[4717]: I0221 22:42:25.389388 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/container-updater/0.log" Feb 21 22:42:25 crc kubenswrapper[4717]: I0221 22:42:25.521014 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/object-replicator/0.log" Feb 21 22:42:25 crc kubenswrapper[4717]: I0221 22:42:25.590782 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/object-updater/0.log" Feb 21 22:42:25 crc kubenswrapper[4717]: I0221 22:42:25.612349 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/object-server/0.log" Feb 21 22:42:25 crc kubenswrapper[4717]: I0221 22:42:25.652614 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/rsync/0.log" Feb 21 22:42:25 crc kubenswrapper[4717]: I0221 22:42:25.736971 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/swift-recon-cron/0.log" Feb 21 22:42:25 crc kubenswrapper[4717]: I0221 22:42:25.953601 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35/tempest-tests-tempest-tests-runner/0.log" Feb 21 22:42:25 crc kubenswrapper[4717]: I0221 22:42:25.954163 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2_6890db4e-d63d-4b19-87aa-b5186b85ece1/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:42:26 crc kubenswrapper[4717]: I0221 22:42:26.157238 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_943cf36f-ab88-4c12-a0c6-455facbc74ad/test-operator-logs-container/0.log" Feb 21 22:42:26 crc kubenswrapper[4717]: I0221 22:42:26.196992 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx_2eefd54d-a632-4fd6-a45c-e12e1f810d4a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:42:32 crc kubenswrapper[4717]: I0221 22:42:32.933810 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_85a966c3-05cc-49d0-ae99-0c774c67e89d/memcached/0.log" Feb 21 22:42:39 crc kubenswrapper[4717]: I0221 22:42:39.062430 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:42:39 crc kubenswrapper[4717]: I0221 22:42:39.063155 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:42:39 crc kubenswrapper[4717]: I0221 22:42:39.063245 4717 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 22:42:39 crc kubenswrapper[4717]: I0221 22:42:39.064098 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ccc78a7a4117f5c2e21732ec2593c73a83c28b2a3108d6d9f273300c9f08f15e"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 22:42:39 crc kubenswrapper[4717]: I0221 22:42:39.064189 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://ccc78a7a4117f5c2e21732ec2593c73a83c28b2a3108d6d9f273300c9f08f15e" gracePeriod=600 Feb 21 22:42:39 crc kubenswrapper[4717]: I0221 22:42:39.546935 4717 generic.go:334] "Generic (PLEG): container finished" podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerID="ccc78a7a4117f5c2e21732ec2593c73a83c28b2a3108d6d9f273300c9f08f15e" exitCode=0 Feb 21 22:42:39 crc kubenswrapper[4717]: I0221 22:42:39.547001 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"ccc78a7a4117f5c2e21732ec2593c73a83c28b2a3108d6d9f273300c9f08f15e"} Feb 21 22:42:39 crc kubenswrapper[4717]: I0221 22:42:39.547355 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"} Feb 21 22:42:39 crc kubenswrapper[4717]: I0221 22:42:39.547378 4717 scope.go:117] "RemoveContainer" 
containerID="904feb7cad64a27972fc7f06791a53b981eabc89e1858d015eec79551caed674" Feb 21 22:42:51 crc kubenswrapper[4717]: I0221 22:42:51.003081 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8_04ed380b-d424-43c9-b15c-384a60a084a0/util/0.log" Feb 21 22:42:51 crc kubenswrapper[4717]: I0221 22:42:51.188420 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8_04ed380b-d424-43c9-b15c-384a60a084a0/util/0.log" Feb 21 22:42:51 crc kubenswrapper[4717]: I0221 22:42:51.203948 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8_04ed380b-d424-43c9-b15c-384a60a084a0/pull/0.log" Feb 21 22:42:51 crc kubenswrapper[4717]: I0221 22:42:51.248962 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8_04ed380b-d424-43c9-b15c-384a60a084a0/pull/0.log" Feb 21 22:42:51 crc kubenswrapper[4717]: I0221 22:42:51.424906 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8_04ed380b-d424-43c9-b15c-384a60a084a0/util/0.log" Feb 21 22:42:51 crc kubenswrapper[4717]: I0221 22:42:51.439721 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8_04ed380b-d424-43c9-b15c-384a60a084a0/extract/0.log" Feb 21 22:42:51 crc kubenswrapper[4717]: I0221 22:42:51.473170 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8_04ed380b-d424-43c9-b15c-384a60a084a0/pull/0.log" Feb 21 22:42:51 crc kubenswrapper[4717]: I0221 22:42:51.959713 4717 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-lgrct_4caf3d32-5fe1-4711-a79f-7ff3b2bee3a6/manager/0.log" Feb 21 22:42:52 crc kubenswrapper[4717]: I0221 22:42:52.248542 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-blpgd_8cb19ba1-4432-41e7-afee-6fccd02f8564/manager/0.log" Feb 21 22:42:52 crc kubenswrapper[4717]: I0221 22:42:52.418427 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-mwvrn_1c89580c-c289-4ca5-b394-a85fa285dc30/manager/0.log" Feb 21 22:42:52 crc kubenswrapper[4717]: I0221 22:42:52.645057 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-qqrnp_5b8a35aa-e7ad-4103-b3db-1011411811db/manager/0.log" Feb 21 22:42:53 crc kubenswrapper[4717]: I0221 22:42:53.081926 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-fkch5_85ba4b92-6749-498a-b112-db89d6856988/manager/0.log" Feb 21 22:42:53 crc kubenswrapper[4717]: I0221 22:42:53.152944 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-klhlb_c5916af5-fc6c-4473-aafd-5331043ac1d8/manager/0.log" Feb 21 22:42:53 crc kubenswrapper[4717]: I0221 22:42:53.230257 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-pb7k5_ff6fe6a4-86ca-4723-915d-b69be63387b6/manager/0.log" Feb 21 22:42:53 crc kubenswrapper[4717]: I0221 22:42:53.390276 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-mflb2_adb80d2f-050a-47f9-afe2-46cd5876e640/manager/0.log" Feb 21 22:42:53 crc kubenswrapper[4717]: I0221 22:42:53.543923 4717 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-66ssv_8c916c65-714b-4f8d-b551-c35239deab87/manager/0.log" Feb 21 22:42:53 crc kubenswrapper[4717]: I0221 22:42:53.750349 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-96s5h_825c4fa5-a334-48b9-9ae0-583beb7e6a6b/manager/0.log" Feb 21 22:42:53 crc kubenswrapper[4717]: I0221 22:42:53.924504 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-vzfhh_5baed11e-00fc-4c09-8a82-fb761682244e/manager/0.log" Feb 21 22:42:54 crc kubenswrapper[4717]: I0221 22:42:54.100757 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-7glks_d839bd2c-8b12-4d02-a6b5-0399f3ded9fd/manager/0.log" Feb 21 22:42:54 crc kubenswrapper[4717]: I0221 22:42:54.175420 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-8mw2x_a2620c81-a9f0-4d4c-b281-5e5effb23419/manager/0.log" Feb 21 22:42:54 crc kubenswrapper[4717]: I0221 22:42:54.327793 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg_6d86a5a0-240a-4b65-af2b-6a5d91d95744/manager/0.log" Feb 21 22:42:54 crc kubenswrapper[4717]: I0221 22:42:54.809802 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5ccb695f5f-bb64r_29923aad-fe1a-464c-8f19-dc10ef9e4eaa/operator/0.log" Feb 21 22:42:55 crc kubenswrapper[4717]: I0221 22:42:55.072512 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hs5wv_0d953b3d-37a2-403a-bba7-369dc024f173/registry-server/0.log" Feb 21 22:42:55 crc kubenswrapper[4717]: I0221 22:42:55.105544 4717 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-gldvl_4b328324-d3f1-4de9-b5b0-fb28bd7dfedd/manager/0.log" Feb 21 22:42:55 crc kubenswrapper[4717]: I0221 22:42:55.333002 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-xs5xq_3204092b-362c-42ed-ab07-3db2d36d32e5/manager/0.log" Feb 21 22:42:55 crc kubenswrapper[4717]: I0221 22:42:55.550594 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-thgdl_8c59a072-f1fc-4ef2-b9c4-f88081ea3a2d/operator/0.log" Feb 21 22:42:55 crc kubenswrapper[4717]: I0221 22:42:55.557359 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-b5bzj_7a130695-4494-482a-b4fb-4703071fd28f/manager/0.log" Feb 21 22:42:55 crc kubenswrapper[4717]: I0221 22:42:55.865337 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-589c568786-6vmk8_97c089e6-a2dd-4d56-8a8f-9c8d4d8f6f8e/manager/0.log" Feb 21 22:42:55 crc kubenswrapper[4717]: I0221 22:42:55.917571 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-j6mms_a8cafe00-f55d-4444-ae31-827ac956b47c/manager/0.log" Feb 21 22:42:56 crc kubenswrapper[4717]: I0221 22:42:56.095399 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-thwwt_037ce2e5-e940-4172-80f4-f3d738a9d363/manager/0.log" Feb 21 22:42:56 crc kubenswrapper[4717]: I0221 22:42:56.188907 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-85dff9d968-589dj_b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8/manager/0.log" Feb 21 22:42:58 crc kubenswrapper[4717]: I0221 
22:42:58.230306 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-877lb_14e06d52-8282-4fcd-9cec-6c29a6336057/manager/0.log" Feb 21 22:43:15 crc kubenswrapper[4717]: I0221 22:43:15.957211 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-r68hh_ff47caaa-cf37-40ea-8c6c-457189a5432b/control-plane-machine-set-operator/0.log" Feb 21 22:43:16 crc kubenswrapper[4717]: I0221 22:43:16.151680 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-84tzn_56758c2e-648d-41fe-8758-439f0070d150/machine-api-operator/0.log" Feb 21 22:43:16 crc kubenswrapper[4717]: I0221 22:43:16.167981 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-84tzn_56758c2e-648d-41fe-8758-439f0070d150/kube-rbac-proxy/0.log" Feb 21 22:43:29 crc kubenswrapper[4717]: I0221 22:43:29.077610 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-tmtcj_36b94939-dd01-40b9-a23b-6408a2cd36e8/cert-manager-controller/0.log" Feb 21 22:43:29 crc kubenswrapper[4717]: I0221 22:43:29.239453 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-pwd5d_11020ca7-c11f-4b18-a5f1-e1ab7bc148d2/cert-manager-cainjector/0.log" Feb 21 22:43:29 crc kubenswrapper[4717]: I0221 22:43:29.299136 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vkn9j_86b0523d-fba9-48d5-aaa6-33682ae7336a/cert-manager-webhook/0.log" Feb 21 22:43:42 crc kubenswrapper[4717]: I0221 22:43:42.318045 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-sdlq7_bef85b90-a8e3-4dbe-b0e7-f57e585bbc15/nmstate-console-plugin/0.log" Feb 21 22:43:42 crc 
kubenswrapper[4717]: I0221 22:43:42.461227 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-wls9d_412a89db-473e-4710-ab4a-ecda68d76787/nmstate-handler/0.log" Feb 21 22:43:42 crc kubenswrapper[4717]: I0221 22:43:42.545084 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-f4sk4_3e791ecf-1811-4ca1-b08c-5e19bb2ee4e1/nmstate-metrics/0.log" Feb 21 22:43:42 crc kubenswrapper[4717]: I0221 22:43:42.552699 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-f4sk4_3e791ecf-1811-4ca1-b08c-5e19bb2ee4e1/kube-rbac-proxy/0.log" Feb 21 22:43:42 crc kubenswrapper[4717]: I0221 22:43:42.683869 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-6z5b8_4074f791-7fb0-4f78-96e7-e926cebfab66/nmstate-operator/0.log" Feb 21 22:43:42 crc kubenswrapper[4717]: I0221 22:43:42.907727 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-g8tr4_68767c95-c372-4de6-bab2-8eaae4cb37b3/nmstate-webhook/0.log" Feb 21 22:43:48 crc kubenswrapper[4717]: I0221 22:43:48.315540 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-twc8s"] Feb 21 22:43:48 crc kubenswrapper[4717]: E0221 22:43:48.316459 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0480db83-97bc-41da-81ea-891f9c65c997" containerName="container-00" Feb 21 22:43:48 crc kubenswrapper[4717]: I0221 22:43:48.316471 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0480db83-97bc-41da-81ea-891f9c65c997" containerName="container-00" Feb 21 22:43:48 crc kubenswrapper[4717]: I0221 22:43:48.316666 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0480db83-97bc-41da-81ea-891f9c65c997" containerName="container-00" Feb 21 22:43:48 crc kubenswrapper[4717]: I0221 22:43:48.317940 4717 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:43:48 crc kubenswrapper[4717]: I0221 22:43:48.338541 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-twc8s"] Feb 21 22:43:48 crc kubenswrapper[4717]: I0221 22:43:48.399623 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e49c29e-6b47-4874-8c4f-52b6259c57e4-utilities\") pod \"redhat-marketplace-twc8s\" (UID: \"9e49c29e-6b47-4874-8c4f-52b6259c57e4\") " pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:43:48 crc kubenswrapper[4717]: I0221 22:43:48.399969 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfp9z\" (UniqueName: \"kubernetes.io/projected/9e49c29e-6b47-4874-8c4f-52b6259c57e4-kube-api-access-vfp9z\") pod \"redhat-marketplace-twc8s\" (UID: \"9e49c29e-6b47-4874-8c4f-52b6259c57e4\") " pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:43:48 crc kubenswrapper[4717]: I0221 22:43:48.400093 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e49c29e-6b47-4874-8c4f-52b6259c57e4-catalog-content\") pod \"redhat-marketplace-twc8s\" (UID: \"9e49c29e-6b47-4874-8c4f-52b6259c57e4\") " pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:43:48 crc kubenswrapper[4717]: I0221 22:43:48.501681 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e49c29e-6b47-4874-8c4f-52b6259c57e4-catalog-content\") pod \"redhat-marketplace-twc8s\" (UID: \"9e49c29e-6b47-4874-8c4f-52b6259c57e4\") " pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:43:48 crc kubenswrapper[4717]: I0221 22:43:48.501844 
4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e49c29e-6b47-4874-8c4f-52b6259c57e4-utilities\") pod \"redhat-marketplace-twc8s\" (UID: \"9e49c29e-6b47-4874-8c4f-52b6259c57e4\") " pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:43:48 crc kubenswrapper[4717]: I0221 22:43:48.501950 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfp9z\" (UniqueName: \"kubernetes.io/projected/9e49c29e-6b47-4874-8c4f-52b6259c57e4-kube-api-access-vfp9z\") pod \"redhat-marketplace-twc8s\" (UID: \"9e49c29e-6b47-4874-8c4f-52b6259c57e4\") " pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:43:48 crc kubenswrapper[4717]: I0221 22:43:48.502186 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e49c29e-6b47-4874-8c4f-52b6259c57e4-catalog-content\") pod \"redhat-marketplace-twc8s\" (UID: \"9e49c29e-6b47-4874-8c4f-52b6259c57e4\") " pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:43:48 crc kubenswrapper[4717]: I0221 22:43:48.502477 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e49c29e-6b47-4874-8c4f-52b6259c57e4-utilities\") pod \"redhat-marketplace-twc8s\" (UID: \"9e49c29e-6b47-4874-8c4f-52b6259c57e4\") " pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:43:48 crc kubenswrapper[4717]: I0221 22:43:48.523431 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfp9z\" (UniqueName: \"kubernetes.io/projected/9e49c29e-6b47-4874-8c4f-52b6259c57e4-kube-api-access-vfp9z\") pod \"redhat-marketplace-twc8s\" (UID: \"9e49c29e-6b47-4874-8c4f-52b6259c57e4\") " pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:43:48 crc kubenswrapper[4717]: I0221 22:43:48.640800 4717 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:43:49 crc kubenswrapper[4717]: I0221 22:43:49.103691 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-twc8s"] Feb 21 22:43:49 crc kubenswrapper[4717]: I0221 22:43:49.192621 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twc8s" event={"ID":"9e49c29e-6b47-4874-8c4f-52b6259c57e4","Type":"ContainerStarted","Data":"dee01df4a4bd44cb00e153ac436694ea790dd6413e6a66a86f5cc76ddfc3b45f"} Feb 21 22:43:50 crc kubenswrapper[4717]: I0221 22:43:50.201072 4717 generic.go:334] "Generic (PLEG): container finished" podID="9e49c29e-6b47-4874-8c4f-52b6259c57e4" containerID="460fc976b4f36c57320c789411812d305b8e52fcd6fe2ecccf8e0cedfd9d5bf2" exitCode=0 Feb 21 22:43:50 crc kubenswrapper[4717]: I0221 22:43:50.201120 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twc8s" event={"ID":"9e49c29e-6b47-4874-8c4f-52b6259c57e4","Type":"ContainerDied","Data":"460fc976b4f36c57320c789411812d305b8e52fcd6fe2ecccf8e0cedfd9d5bf2"} Feb 21 22:43:51 crc kubenswrapper[4717]: I0221 22:43:51.215453 4717 generic.go:334] "Generic (PLEG): container finished" podID="9e49c29e-6b47-4874-8c4f-52b6259c57e4" containerID="1187f88d38efdf202c8d509455ac78c3643954972a8543aaf5276e46896283ff" exitCode=0 Feb 21 22:43:51 crc kubenswrapper[4717]: I0221 22:43:51.215555 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twc8s" event={"ID":"9e49c29e-6b47-4874-8c4f-52b6259c57e4","Type":"ContainerDied","Data":"1187f88d38efdf202c8d509455ac78c3643954972a8543aaf5276e46896283ff"} Feb 21 22:43:52 crc kubenswrapper[4717]: I0221 22:43:52.253400 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twc8s" 
event={"ID":"9e49c29e-6b47-4874-8c4f-52b6259c57e4","Type":"ContainerStarted","Data":"9701f44b2975c577f3f38b5b9bf19be5884b61275587401ddf6e09059dabc361"} Feb 21 22:43:52 crc kubenswrapper[4717]: I0221 22:43:52.284194 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-twc8s" podStartSLOduration=2.888528643 podStartE2EDuration="4.284164766s" podCreationTimestamp="2026-02-21 22:43:48 +0000 UTC" firstStartedPulling="2026-02-21 22:43:50.204838579 +0000 UTC m=+3444.986372201" lastFinishedPulling="2026-02-21 22:43:51.600474692 +0000 UTC m=+3446.382008324" observedRunningTime="2026-02-21 22:43:52.282379343 +0000 UTC m=+3447.063912985" watchObservedRunningTime="2026-02-21 22:43:52.284164766 +0000 UTC m=+3447.065698398" Feb 21 22:43:58 crc kubenswrapper[4717]: I0221 22:43:58.641511 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:43:58 crc kubenswrapper[4717]: I0221 22:43:58.642150 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:43:58 crc kubenswrapper[4717]: I0221 22:43:58.698206 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:43:59 crc kubenswrapper[4717]: I0221 22:43:59.395906 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:43:59 crc kubenswrapper[4717]: I0221 22:43:59.601648 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-twc8s"] Feb 21 22:44:01 crc kubenswrapper[4717]: I0221 22:44:01.328337 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-twc8s" podUID="9e49c29e-6b47-4874-8c4f-52b6259c57e4" containerName="registry-server" 
containerID="cri-o://9701f44b2975c577f3f38b5b9bf19be5884b61275587401ddf6e09059dabc361" gracePeriod=2 Feb 21 22:44:01 crc kubenswrapper[4717]: I0221 22:44:01.807326 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:44:01 crc kubenswrapper[4717]: I0221 22:44:01.857444 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e49c29e-6b47-4874-8c4f-52b6259c57e4-utilities\") pod \"9e49c29e-6b47-4874-8c4f-52b6259c57e4\" (UID: \"9e49c29e-6b47-4874-8c4f-52b6259c57e4\") " Feb 21 22:44:01 crc kubenswrapper[4717]: I0221 22:44:01.857673 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfp9z\" (UniqueName: \"kubernetes.io/projected/9e49c29e-6b47-4874-8c4f-52b6259c57e4-kube-api-access-vfp9z\") pod \"9e49c29e-6b47-4874-8c4f-52b6259c57e4\" (UID: \"9e49c29e-6b47-4874-8c4f-52b6259c57e4\") " Feb 21 22:44:01 crc kubenswrapper[4717]: I0221 22:44:01.857759 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e49c29e-6b47-4874-8c4f-52b6259c57e4-catalog-content\") pod \"9e49c29e-6b47-4874-8c4f-52b6259c57e4\" (UID: \"9e49c29e-6b47-4874-8c4f-52b6259c57e4\") " Feb 21 22:44:01 crc kubenswrapper[4717]: I0221 22:44:01.858561 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e49c29e-6b47-4874-8c4f-52b6259c57e4-utilities" (OuterVolumeSpecName: "utilities") pod "9e49c29e-6b47-4874-8c4f-52b6259c57e4" (UID: "9e49c29e-6b47-4874-8c4f-52b6259c57e4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:44:01 crc kubenswrapper[4717]: I0221 22:44:01.865110 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e49c29e-6b47-4874-8c4f-52b6259c57e4-kube-api-access-vfp9z" (OuterVolumeSpecName: "kube-api-access-vfp9z") pod "9e49c29e-6b47-4874-8c4f-52b6259c57e4" (UID: "9e49c29e-6b47-4874-8c4f-52b6259c57e4"). InnerVolumeSpecName "kube-api-access-vfp9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:44:01 crc kubenswrapper[4717]: I0221 22:44:01.880053 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e49c29e-6b47-4874-8c4f-52b6259c57e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e49c29e-6b47-4874-8c4f-52b6259c57e4" (UID: "9e49c29e-6b47-4874-8c4f-52b6259c57e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:44:01 crc kubenswrapper[4717]: I0221 22:44:01.960142 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfp9z\" (UniqueName: \"kubernetes.io/projected/9e49c29e-6b47-4874-8c4f-52b6259c57e4-kube-api-access-vfp9z\") on node \"crc\" DevicePath \"\"" Feb 21 22:44:01 crc kubenswrapper[4717]: I0221 22:44:01.960200 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e49c29e-6b47-4874-8c4f-52b6259c57e4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:44:01 crc kubenswrapper[4717]: I0221 22:44:01.960216 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e49c29e-6b47-4874-8c4f-52b6259c57e4-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:44:02 crc kubenswrapper[4717]: I0221 22:44:02.338949 4717 generic.go:334] "Generic (PLEG): container finished" podID="9e49c29e-6b47-4874-8c4f-52b6259c57e4" 
containerID="9701f44b2975c577f3f38b5b9bf19be5884b61275587401ddf6e09059dabc361" exitCode=0 Feb 21 22:44:02 crc kubenswrapper[4717]: I0221 22:44:02.339211 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twc8s" Feb 21 22:44:02 crc kubenswrapper[4717]: I0221 22:44:02.339256 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twc8s" event={"ID":"9e49c29e-6b47-4874-8c4f-52b6259c57e4","Type":"ContainerDied","Data":"9701f44b2975c577f3f38b5b9bf19be5884b61275587401ddf6e09059dabc361"} Feb 21 22:44:02 crc kubenswrapper[4717]: I0221 22:44:02.339296 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twc8s" event={"ID":"9e49c29e-6b47-4874-8c4f-52b6259c57e4","Type":"ContainerDied","Data":"dee01df4a4bd44cb00e153ac436694ea790dd6413e6a66a86f5cc76ddfc3b45f"} Feb 21 22:44:02 crc kubenswrapper[4717]: I0221 22:44:02.339316 4717 scope.go:117] "RemoveContainer" containerID="9701f44b2975c577f3f38b5b9bf19be5884b61275587401ddf6e09059dabc361" Feb 21 22:44:02 crc kubenswrapper[4717]: I0221 22:44:02.377784 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-twc8s"] Feb 21 22:44:02 crc kubenswrapper[4717]: I0221 22:44:02.387181 4717 scope.go:117] "RemoveContainer" containerID="1187f88d38efdf202c8d509455ac78c3643954972a8543aaf5276e46896283ff" Feb 21 22:44:02 crc kubenswrapper[4717]: I0221 22:44:02.391680 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-twc8s"] Feb 21 22:44:02 crc kubenswrapper[4717]: I0221 22:44:02.404721 4717 scope.go:117] "RemoveContainer" containerID="460fc976b4f36c57320c789411812d305b8e52fcd6fe2ecccf8e0cedfd9d5bf2" Feb 21 22:44:02 crc kubenswrapper[4717]: I0221 22:44:02.454477 4717 scope.go:117] "RemoveContainer" containerID="9701f44b2975c577f3f38b5b9bf19be5884b61275587401ddf6e09059dabc361" Feb 21 
22:44:02 crc kubenswrapper[4717]: E0221 22:44:02.454807 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9701f44b2975c577f3f38b5b9bf19be5884b61275587401ddf6e09059dabc361\": container with ID starting with 9701f44b2975c577f3f38b5b9bf19be5884b61275587401ddf6e09059dabc361 not found: ID does not exist" containerID="9701f44b2975c577f3f38b5b9bf19be5884b61275587401ddf6e09059dabc361" Feb 21 22:44:02 crc kubenswrapper[4717]: I0221 22:44:02.454893 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9701f44b2975c577f3f38b5b9bf19be5884b61275587401ddf6e09059dabc361"} err="failed to get container status \"9701f44b2975c577f3f38b5b9bf19be5884b61275587401ddf6e09059dabc361\": rpc error: code = NotFound desc = could not find container \"9701f44b2975c577f3f38b5b9bf19be5884b61275587401ddf6e09059dabc361\": container with ID starting with 9701f44b2975c577f3f38b5b9bf19be5884b61275587401ddf6e09059dabc361 not found: ID does not exist" Feb 21 22:44:02 crc kubenswrapper[4717]: I0221 22:44:02.454929 4717 scope.go:117] "RemoveContainer" containerID="1187f88d38efdf202c8d509455ac78c3643954972a8543aaf5276e46896283ff" Feb 21 22:44:02 crc kubenswrapper[4717]: E0221 22:44:02.455260 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1187f88d38efdf202c8d509455ac78c3643954972a8543aaf5276e46896283ff\": container with ID starting with 1187f88d38efdf202c8d509455ac78c3643954972a8543aaf5276e46896283ff not found: ID does not exist" containerID="1187f88d38efdf202c8d509455ac78c3643954972a8543aaf5276e46896283ff" Feb 21 22:44:02 crc kubenswrapper[4717]: I0221 22:44:02.455290 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1187f88d38efdf202c8d509455ac78c3643954972a8543aaf5276e46896283ff"} err="failed to get container status 
\"1187f88d38efdf202c8d509455ac78c3643954972a8543aaf5276e46896283ff\": rpc error: code = NotFound desc = could not find container \"1187f88d38efdf202c8d509455ac78c3643954972a8543aaf5276e46896283ff\": container with ID starting with 1187f88d38efdf202c8d509455ac78c3643954972a8543aaf5276e46896283ff not found: ID does not exist" Feb 21 22:44:02 crc kubenswrapper[4717]: I0221 22:44:02.455313 4717 scope.go:117] "RemoveContainer" containerID="460fc976b4f36c57320c789411812d305b8e52fcd6fe2ecccf8e0cedfd9d5bf2" Feb 21 22:44:02 crc kubenswrapper[4717]: E0221 22:44:02.455493 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"460fc976b4f36c57320c789411812d305b8e52fcd6fe2ecccf8e0cedfd9d5bf2\": container with ID starting with 460fc976b4f36c57320c789411812d305b8e52fcd6fe2ecccf8e0cedfd9d5bf2 not found: ID does not exist" containerID="460fc976b4f36c57320c789411812d305b8e52fcd6fe2ecccf8e0cedfd9d5bf2" Feb 21 22:44:02 crc kubenswrapper[4717]: I0221 22:44:02.455518 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"460fc976b4f36c57320c789411812d305b8e52fcd6fe2ecccf8e0cedfd9d5bf2"} err="failed to get container status \"460fc976b4f36c57320c789411812d305b8e52fcd6fe2ecccf8e0cedfd9d5bf2\": rpc error: code = NotFound desc = could not find container \"460fc976b4f36c57320c789411812d305b8e52fcd6fe2ecccf8e0cedfd9d5bf2\": container with ID starting with 460fc976b4f36c57320c789411812d305b8e52fcd6fe2ecccf8e0cedfd9d5bf2 not found: ID does not exist" Feb 21 22:44:04 crc kubenswrapper[4717]: I0221 22:44:04.004080 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e49c29e-6b47-4874-8c4f-52b6259c57e4" path="/var/lib/kubelet/pods/9e49c29e-6b47-4874-8c4f-52b6259c57e4/volumes" Feb 21 22:44:13 crc kubenswrapper[4717]: I0221 22:44:13.009988 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-69bbfbf88f-gq99r_75d38a7a-9bcd-49d6-812c-6d451c933f87/kube-rbac-proxy/0.log" Feb 21 22:44:13 crc kubenswrapper[4717]: I0221 22:44:13.179617 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-gq99r_75d38a7a-9bcd-49d6-812c-6d451c933f87/controller/0.log" Feb 21 22:44:13 crc kubenswrapper[4717]: I0221 22:44:13.217557 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-frr-files/0.log" Feb 21 22:44:13 crc kubenswrapper[4717]: I0221 22:44:13.339535 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-frr-files/0.log" Feb 21 22:44:13 crc kubenswrapper[4717]: I0221 22:44:13.382192 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-metrics/0.log" Feb 21 22:44:13 crc kubenswrapper[4717]: I0221 22:44:13.389014 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-reloader/0.log" Feb 21 22:44:13 crc kubenswrapper[4717]: I0221 22:44:13.399068 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-reloader/0.log" Feb 21 22:44:13 crc kubenswrapper[4717]: I0221 22:44:13.600194 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-frr-files/0.log" Feb 21 22:44:13 crc kubenswrapper[4717]: I0221 22:44:13.614966 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-metrics/0.log" Feb 21 22:44:13 crc kubenswrapper[4717]: I0221 22:44:13.629336 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-reloader/0.log" Feb 21 22:44:13 crc kubenswrapper[4717]: I0221 22:44:13.655307 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-metrics/0.log" Feb 21 22:44:13 crc kubenswrapper[4717]: I0221 22:44:13.819539 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-reloader/0.log" Feb 21 22:44:13 crc kubenswrapper[4717]: I0221 22:44:13.860716 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-frr-files/0.log" Feb 21 22:44:13 crc kubenswrapper[4717]: I0221 22:44:13.864652 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-metrics/0.log" Feb 21 22:44:13 crc kubenswrapper[4717]: I0221 22:44:13.889652 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/controller/0.log" Feb 21 22:44:14 crc kubenswrapper[4717]: I0221 22:44:14.059936 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/kube-rbac-proxy/0.log" Feb 21 22:44:14 crc kubenswrapper[4717]: I0221 22:44:14.108608 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/frr-metrics/0.log" Feb 21 22:44:14 crc kubenswrapper[4717]: I0221 22:44:14.127299 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/kube-rbac-proxy-frr/0.log" Feb 21 22:44:14 crc kubenswrapper[4717]: I0221 22:44:14.240156 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/reloader/0.log" Feb 21 22:44:14 crc kubenswrapper[4717]: I0221 22:44:14.345377 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-bklbl_0f79dc1a-f5c3-4b0b-827f-1aeb3729478e/frr-k8s-webhook-server/0.log" Feb 21 22:44:14 crc kubenswrapper[4717]: I0221 22:44:14.595100 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-644bd788d-hj4nv_2c9c10bb-4b07-4f5e-af44-0353e53010d1/manager/0.log" Feb 21 22:44:14 crc kubenswrapper[4717]: I0221 22:44:14.690481 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7cdb748cc4-h69fk_d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95/webhook-server/0.log" Feb 21 22:44:14 crc kubenswrapper[4717]: I0221 22:44:14.885317 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-r4xwk_1063cb22-b437-4152-913e-9673c0a51b7a/kube-rbac-proxy/0.log" Feb 21 22:44:15 crc kubenswrapper[4717]: I0221 22:44:15.285626 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/frr/0.log" Feb 21 22:44:15 crc kubenswrapper[4717]: I0221 22:44:15.339507 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-r4xwk_1063cb22-b437-4152-913e-9673c0a51b7a/speaker/0.log" Feb 21 22:44:20 crc kubenswrapper[4717]: I0221 22:44:20.763022 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9dp45"] Feb 21 22:44:20 crc kubenswrapper[4717]: E0221 22:44:20.763766 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e49c29e-6b47-4874-8c4f-52b6259c57e4" containerName="extract-utilities" Feb 21 22:44:20 crc kubenswrapper[4717]: I0221 22:44:20.763778 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9e49c29e-6b47-4874-8c4f-52b6259c57e4" containerName="extract-utilities" Feb 21 22:44:20 crc kubenswrapper[4717]: E0221 22:44:20.763818 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e49c29e-6b47-4874-8c4f-52b6259c57e4" containerName="extract-content" Feb 21 22:44:20 crc kubenswrapper[4717]: I0221 22:44:20.763825 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e49c29e-6b47-4874-8c4f-52b6259c57e4" containerName="extract-content" Feb 21 22:44:20 crc kubenswrapper[4717]: E0221 22:44:20.763834 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e49c29e-6b47-4874-8c4f-52b6259c57e4" containerName="registry-server" Feb 21 22:44:20 crc kubenswrapper[4717]: I0221 22:44:20.763840 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e49c29e-6b47-4874-8c4f-52b6259c57e4" containerName="registry-server" Feb 21 22:44:20 crc kubenswrapper[4717]: I0221 22:44:20.764008 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e49c29e-6b47-4874-8c4f-52b6259c57e4" containerName="registry-server" Feb 21 22:44:20 crc kubenswrapper[4717]: I0221 22:44:20.765255 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:20 crc kubenswrapper[4717]: I0221 22:44:20.796488 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9dp45"] Feb 21 22:44:20 crc kubenswrapper[4717]: I0221 22:44:20.873165 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c438b4bf-6e4e-433c-b952-81825ae52873-utilities\") pod \"redhat-operators-9dp45\" (UID: \"c438b4bf-6e4e-433c-b952-81825ae52873\") " pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:20 crc kubenswrapper[4717]: I0221 22:44:20.873281 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbmfr\" (UniqueName: \"kubernetes.io/projected/c438b4bf-6e4e-433c-b952-81825ae52873-kube-api-access-vbmfr\") pod \"redhat-operators-9dp45\" (UID: \"c438b4bf-6e4e-433c-b952-81825ae52873\") " pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:20 crc kubenswrapper[4717]: I0221 22:44:20.873347 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c438b4bf-6e4e-433c-b952-81825ae52873-catalog-content\") pod \"redhat-operators-9dp45\" (UID: \"c438b4bf-6e4e-433c-b952-81825ae52873\") " pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:20 crc kubenswrapper[4717]: I0221 22:44:20.975146 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c438b4bf-6e4e-433c-b952-81825ae52873-catalog-content\") pod \"redhat-operators-9dp45\" (UID: \"c438b4bf-6e4e-433c-b952-81825ae52873\") " pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:20 crc kubenswrapper[4717]: I0221 22:44:20.975274 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c438b4bf-6e4e-433c-b952-81825ae52873-utilities\") pod \"redhat-operators-9dp45\" (UID: \"c438b4bf-6e4e-433c-b952-81825ae52873\") " pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:20 crc kubenswrapper[4717]: I0221 22:44:20.975319 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbmfr\" (UniqueName: \"kubernetes.io/projected/c438b4bf-6e4e-433c-b952-81825ae52873-kube-api-access-vbmfr\") pod \"redhat-operators-9dp45\" (UID: \"c438b4bf-6e4e-433c-b952-81825ae52873\") " pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:20 crc kubenswrapper[4717]: I0221 22:44:20.976246 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c438b4bf-6e4e-433c-b952-81825ae52873-utilities\") pod \"redhat-operators-9dp45\" (UID: \"c438b4bf-6e4e-433c-b952-81825ae52873\") " pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:20 crc kubenswrapper[4717]: I0221 22:44:20.976325 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c438b4bf-6e4e-433c-b952-81825ae52873-catalog-content\") pod \"redhat-operators-9dp45\" (UID: \"c438b4bf-6e4e-433c-b952-81825ae52873\") " pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:21 crc kubenswrapper[4717]: I0221 22:44:21.004321 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbmfr\" (UniqueName: \"kubernetes.io/projected/c438b4bf-6e4e-433c-b952-81825ae52873-kube-api-access-vbmfr\") pod \"redhat-operators-9dp45\" (UID: \"c438b4bf-6e4e-433c-b952-81825ae52873\") " pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:21 crc kubenswrapper[4717]: I0221 22:44:21.144647 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:21 crc kubenswrapper[4717]: I0221 22:44:21.627656 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9dp45"] Feb 21 22:44:22 crc kubenswrapper[4717]: I0221 22:44:22.532713 4717 generic.go:334] "Generic (PLEG): container finished" podID="c438b4bf-6e4e-433c-b952-81825ae52873" containerID="c5f9e62514b5a66dcb0386a171833c7571b16180a4750d2120ac91407d2022d1" exitCode=0 Feb 21 22:44:22 crc kubenswrapper[4717]: I0221 22:44:22.532963 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9dp45" event={"ID":"c438b4bf-6e4e-433c-b952-81825ae52873","Type":"ContainerDied","Data":"c5f9e62514b5a66dcb0386a171833c7571b16180a4750d2120ac91407d2022d1"} Feb 21 22:44:22 crc kubenswrapper[4717]: I0221 22:44:22.533259 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9dp45" event={"ID":"c438b4bf-6e4e-433c-b952-81825ae52873","Type":"ContainerStarted","Data":"30210ae9ece23632ffbbc758451488e18107630111a5bf492272917ab4473fa4"} Feb 21 22:44:23 crc kubenswrapper[4717]: I0221 22:44:23.547154 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9dp45" event={"ID":"c438b4bf-6e4e-433c-b952-81825ae52873","Type":"ContainerStarted","Data":"52811db7f7e73fdba9a4524aef8cc7d70ce7f9c700eba690c190bdddc5daffa7"} Feb 21 22:44:24 crc kubenswrapper[4717]: I0221 22:44:24.556956 4717 generic.go:334] "Generic (PLEG): container finished" podID="c438b4bf-6e4e-433c-b952-81825ae52873" containerID="52811db7f7e73fdba9a4524aef8cc7d70ce7f9c700eba690c190bdddc5daffa7" exitCode=0 Feb 21 22:44:24 crc kubenswrapper[4717]: I0221 22:44:24.557007 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9dp45" 
event={"ID":"c438b4bf-6e4e-433c-b952-81825ae52873","Type":"ContainerDied","Data":"52811db7f7e73fdba9a4524aef8cc7d70ce7f9c700eba690c190bdddc5daffa7"} Feb 21 22:44:25 crc kubenswrapper[4717]: I0221 22:44:25.567095 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9dp45" event={"ID":"c438b4bf-6e4e-433c-b952-81825ae52873","Type":"ContainerStarted","Data":"c1ed798bbe15e30e87f7691d1673109f4b1bf12e90302c08baf25f7c5e167717"} Feb 21 22:44:25 crc kubenswrapper[4717]: I0221 22:44:25.591343 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9dp45" podStartSLOduration=2.943227044 podStartE2EDuration="5.591321938s" podCreationTimestamp="2026-02-21 22:44:20 +0000 UTC" firstStartedPulling="2026-02-21 22:44:22.534589862 +0000 UTC m=+3477.316123484" lastFinishedPulling="2026-02-21 22:44:25.182684716 +0000 UTC m=+3479.964218378" observedRunningTime="2026-02-21 22:44:25.58465345 +0000 UTC m=+3480.366187072" watchObservedRunningTime="2026-02-21 22:44:25.591321938 +0000 UTC m=+3480.372855560" Feb 21 22:44:29 crc kubenswrapper[4717]: I0221 22:44:29.737998 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff_c14c7d98-9c2b-456a-9dc8-0857592681bb/util/0.log" Feb 21 22:44:29 crc kubenswrapper[4717]: I0221 22:44:29.923112 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff_c14c7d98-9c2b-456a-9dc8-0857592681bb/pull/0.log" Feb 21 22:44:29 crc kubenswrapper[4717]: I0221 22:44:29.935416 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff_c14c7d98-9c2b-456a-9dc8-0857592681bb/util/0.log" Feb 21 22:44:29 crc kubenswrapper[4717]: I0221 22:44:29.966940 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff_c14c7d98-9c2b-456a-9dc8-0857592681bb/pull/0.log" Feb 21 22:44:30 crc kubenswrapper[4717]: I0221 22:44:30.157366 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff_c14c7d98-9c2b-456a-9dc8-0857592681bb/extract/0.log" Feb 21 22:44:30 crc kubenswrapper[4717]: I0221 22:44:30.191254 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff_c14c7d98-9c2b-456a-9dc8-0857592681bb/util/0.log" Feb 21 22:44:30 crc kubenswrapper[4717]: I0221 22:44:30.262019 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff_c14c7d98-9c2b-456a-9dc8-0857592681bb/pull/0.log" Feb 21 22:44:30 crc kubenswrapper[4717]: I0221 22:44:30.367511 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6d6h2_faf0d743-c229-4c65-b34e-8220f1bb1cc1/extract-utilities/0.log" Feb 21 22:44:30 crc kubenswrapper[4717]: I0221 22:44:30.504988 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6d6h2_faf0d743-c229-4c65-b34e-8220f1bb1cc1/extract-utilities/0.log" Feb 21 22:44:30 crc kubenswrapper[4717]: I0221 22:44:30.553288 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6d6h2_faf0d743-c229-4c65-b34e-8220f1bb1cc1/extract-content/0.log" Feb 21 22:44:30 crc kubenswrapper[4717]: I0221 22:44:30.553447 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6d6h2_faf0d743-c229-4c65-b34e-8220f1bb1cc1/extract-content/0.log" Feb 21 22:44:30 crc kubenswrapper[4717]: I0221 22:44:30.694528 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-6d6h2_faf0d743-c229-4c65-b34e-8220f1bb1cc1/extract-utilities/0.log" Feb 21 22:44:30 crc kubenswrapper[4717]: I0221 22:44:30.696131 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6d6h2_faf0d743-c229-4c65-b34e-8220f1bb1cc1/extract-content/0.log" Feb 21 22:44:30 crc kubenswrapper[4717]: I0221 22:44:30.914845 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dljqt_175e04b6-74b9-44e7-94ce-950ffeb16677/extract-utilities/0.log" Feb 21 22:44:31 crc kubenswrapper[4717]: I0221 22:44:31.091967 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dljqt_175e04b6-74b9-44e7-94ce-950ffeb16677/extract-utilities/0.log" Feb 21 22:44:31 crc kubenswrapper[4717]: I0221 22:44:31.132441 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dljqt_175e04b6-74b9-44e7-94ce-950ffeb16677/extract-content/0.log" Feb 21 22:44:31 crc kubenswrapper[4717]: I0221 22:44:31.145561 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:31 crc kubenswrapper[4717]: I0221 22:44:31.145599 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:31 crc kubenswrapper[4717]: I0221 22:44:31.155729 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dljqt_175e04b6-74b9-44e7-94ce-950ffeb16677/extract-content/0.log" Feb 21 22:44:31 crc kubenswrapper[4717]: I0221 22:44:31.357003 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dljqt_175e04b6-74b9-44e7-94ce-950ffeb16677/extract-utilities/0.log" Feb 21 22:44:31 crc kubenswrapper[4717]: I0221 22:44:31.432276 4717 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dljqt_175e04b6-74b9-44e7-94ce-950ffeb16677/extract-content/0.log" Feb 21 22:44:31 crc kubenswrapper[4717]: I0221 22:44:31.710624 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp_eb912310-4451-4fc6-b05b-675b6bcfff59/util/0.log" Feb 21 22:44:31 crc kubenswrapper[4717]: I0221 22:44:31.929371 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6d6h2_faf0d743-c229-4c65-b34e-8220f1bb1cc1/registry-server/0.log" Feb 21 22:44:31 crc kubenswrapper[4717]: I0221 22:44:31.949171 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp_eb912310-4451-4fc6-b05b-675b6bcfff59/util/0.log" Feb 21 22:44:31 crc kubenswrapper[4717]: I0221 22:44:31.979122 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp_eb912310-4451-4fc6-b05b-675b6bcfff59/pull/0.log" Feb 21 22:44:32 crc kubenswrapper[4717]: I0221 22:44:32.014830 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dljqt_175e04b6-74b9-44e7-94ce-950ffeb16677/registry-server/0.log" Feb 21 22:44:32 crc kubenswrapper[4717]: I0221 22:44:32.123420 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp_eb912310-4451-4fc6-b05b-675b6bcfff59/pull/0.log" Feb 21 22:44:32 crc kubenswrapper[4717]: I0221 22:44:32.216545 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9dp45" podUID="c438b4bf-6e4e-433c-b952-81825ae52873" containerName="registry-server" probeResult="failure" output=< Feb 21 22:44:32 crc kubenswrapper[4717]: 
timeout: failed to connect service ":50051" within 1s Feb 21 22:44:32 crc kubenswrapper[4717]: > Feb 21 22:44:32 crc kubenswrapper[4717]: I0221 22:44:32.245668 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp_eb912310-4451-4fc6-b05b-675b6bcfff59/util/0.log" Feb 21 22:44:32 crc kubenswrapper[4717]: I0221 22:44:32.276266 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp_eb912310-4451-4fc6-b05b-675b6bcfff59/extract/0.log" Feb 21 22:44:32 crc kubenswrapper[4717]: I0221 22:44:32.277003 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp_eb912310-4451-4fc6-b05b-675b6bcfff59/pull/0.log" Feb 21 22:44:32 crc kubenswrapper[4717]: I0221 22:44:32.469876 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kd64g_5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d/marketplace-operator/0.log" Feb 21 22:44:32 crc kubenswrapper[4717]: I0221 22:44:32.471720 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dhgrt_57258517-86da-432f-8123-e3af4325d01a/extract-utilities/0.log" Feb 21 22:44:32 crc kubenswrapper[4717]: I0221 22:44:32.645216 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dhgrt_57258517-86da-432f-8123-e3af4325d01a/extract-content/0.log" Feb 21 22:44:32 crc kubenswrapper[4717]: I0221 22:44:32.675028 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dhgrt_57258517-86da-432f-8123-e3af4325d01a/extract-content/0.log" Feb 21 22:44:32 crc kubenswrapper[4717]: I0221 22:44:32.680495 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-dhgrt_57258517-86da-432f-8123-e3af4325d01a/extract-utilities/0.log" Feb 21 22:44:32 crc kubenswrapper[4717]: I0221 22:44:32.811723 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dhgrt_57258517-86da-432f-8123-e3af4325d01a/extract-content/0.log" Feb 21 22:44:32 crc kubenswrapper[4717]: I0221 22:44:32.815514 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dhgrt_57258517-86da-432f-8123-e3af4325d01a/extract-utilities/0.log" Feb 21 22:44:32 crc kubenswrapper[4717]: I0221 22:44:32.950112 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dhgrt_57258517-86da-432f-8123-e3af4325d01a/registry-server/0.log" Feb 21 22:44:33 crc kubenswrapper[4717]: I0221 22:44:33.002998 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dp45_c438b4bf-6e4e-433c-b952-81825ae52873/extract-utilities/0.log" Feb 21 22:44:33 crc kubenswrapper[4717]: I0221 22:44:33.261057 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dp45_c438b4bf-6e4e-433c-b952-81825ae52873/extract-utilities/0.log" Feb 21 22:44:33 crc kubenswrapper[4717]: I0221 22:44:33.261108 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dp45_c438b4bf-6e4e-433c-b952-81825ae52873/extract-content/0.log" Feb 21 22:44:33 crc kubenswrapper[4717]: I0221 22:44:33.301375 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dp45_c438b4bf-6e4e-433c-b952-81825ae52873/extract-content/0.log" Feb 21 22:44:33 crc kubenswrapper[4717]: I0221 22:44:33.433293 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dp45_c438b4bf-6e4e-433c-b952-81825ae52873/extract-content/0.log" 
Feb 21 22:44:33 crc kubenswrapper[4717]: I0221 22:44:33.437946 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dp45_c438b4bf-6e4e-433c-b952-81825ae52873/registry-server/0.log" Feb 21 22:44:33 crc kubenswrapper[4717]: I0221 22:44:33.439838 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dp45_c438b4bf-6e4e-433c-b952-81825ae52873/extract-utilities/0.log" Feb 21 22:44:33 crc kubenswrapper[4717]: I0221 22:44:33.595645 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pbvfd_38f581da-ba2e-469c-b3eb-745f5b14190e/extract-utilities/0.log" Feb 21 22:44:33 crc kubenswrapper[4717]: I0221 22:44:33.782679 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pbvfd_38f581da-ba2e-469c-b3eb-745f5b14190e/extract-content/0.log" Feb 21 22:44:33 crc kubenswrapper[4717]: I0221 22:44:33.799475 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pbvfd_38f581da-ba2e-469c-b3eb-745f5b14190e/extract-content/0.log" Feb 21 22:44:33 crc kubenswrapper[4717]: I0221 22:44:33.806935 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pbvfd_38f581da-ba2e-469c-b3eb-745f5b14190e/extract-utilities/0.log" Feb 21 22:44:33 crc kubenswrapper[4717]: I0221 22:44:33.930004 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pbvfd_38f581da-ba2e-469c-b3eb-745f5b14190e/extract-content/0.log" Feb 21 22:44:33 crc kubenswrapper[4717]: I0221 22:44:33.935807 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pbvfd_38f581da-ba2e-469c-b3eb-745f5b14190e/extract-utilities/0.log" Feb 21 22:44:34 crc kubenswrapper[4717]: I0221 22:44:34.306077 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-pbvfd_38f581da-ba2e-469c-b3eb-745f5b14190e/registry-server/0.log" Feb 21 22:44:39 crc kubenswrapper[4717]: I0221 22:44:39.063296 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:44:39 crc kubenswrapper[4717]: I0221 22:44:39.063814 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:44:41 crc kubenswrapper[4717]: I0221 22:44:41.201642 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:41 crc kubenswrapper[4717]: I0221 22:44:41.271510 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:41 crc kubenswrapper[4717]: I0221 22:44:41.450928 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9dp45"] Feb 21 22:44:42 crc kubenswrapper[4717]: I0221 22:44:42.721707 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9dp45" podUID="c438b4bf-6e4e-433c-b952-81825ae52873" containerName="registry-server" containerID="cri-o://c1ed798bbe15e30e87f7691d1673109f4b1bf12e90302c08baf25f7c5e167717" gracePeriod=2 Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.208673 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.313968 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbmfr\" (UniqueName: \"kubernetes.io/projected/c438b4bf-6e4e-433c-b952-81825ae52873-kube-api-access-vbmfr\") pod \"c438b4bf-6e4e-433c-b952-81825ae52873\" (UID: \"c438b4bf-6e4e-433c-b952-81825ae52873\") " Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.314458 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c438b4bf-6e4e-433c-b952-81825ae52873-utilities\") pod \"c438b4bf-6e4e-433c-b952-81825ae52873\" (UID: \"c438b4bf-6e4e-433c-b952-81825ae52873\") " Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.314705 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c438b4bf-6e4e-433c-b952-81825ae52873-catalog-content\") pod \"c438b4bf-6e4e-433c-b952-81825ae52873\" (UID: \"c438b4bf-6e4e-433c-b952-81825ae52873\") " Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.315126 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c438b4bf-6e4e-433c-b952-81825ae52873-utilities" (OuterVolumeSpecName: "utilities") pod "c438b4bf-6e4e-433c-b952-81825ae52873" (UID: "c438b4bf-6e4e-433c-b952-81825ae52873"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.315356 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c438b4bf-6e4e-433c-b952-81825ae52873-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.335138 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c438b4bf-6e4e-433c-b952-81825ae52873-kube-api-access-vbmfr" (OuterVolumeSpecName: "kube-api-access-vbmfr") pod "c438b4bf-6e4e-433c-b952-81825ae52873" (UID: "c438b4bf-6e4e-433c-b952-81825ae52873"). InnerVolumeSpecName "kube-api-access-vbmfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.416890 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbmfr\" (UniqueName: \"kubernetes.io/projected/c438b4bf-6e4e-433c-b952-81825ae52873-kube-api-access-vbmfr\") on node \"crc\" DevicePath \"\"" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.443790 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c438b4bf-6e4e-433c-b952-81825ae52873-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c438b4bf-6e4e-433c-b952-81825ae52873" (UID: "c438b4bf-6e4e-433c-b952-81825ae52873"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.519025 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c438b4bf-6e4e-433c-b952-81825ae52873-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.731798 4717 generic.go:334] "Generic (PLEG): container finished" podID="c438b4bf-6e4e-433c-b952-81825ae52873" containerID="c1ed798bbe15e30e87f7691d1673109f4b1bf12e90302c08baf25f7c5e167717" exitCode=0 Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.731844 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9dp45" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.731889 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9dp45" event={"ID":"c438b4bf-6e4e-433c-b952-81825ae52873","Type":"ContainerDied","Data":"c1ed798bbe15e30e87f7691d1673109f4b1bf12e90302c08baf25f7c5e167717"} Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.731966 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9dp45" event={"ID":"c438b4bf-6e4e-433c-b952-81825ae52873","Type":"ContainerDied","Data":"30210ae9ece23632ffbbc758451488e18107630111a5bf492272917ab4473fa4"} Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.732003 4717 scope.go:117] "RemoveContainer" containerID="c1ed798bbe15e30e87f7691d1673109f4b1bf12e90302c08baf25f7c5e167717" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.756298 4717 scope.go:117] "RemoveContainer" containerID="52811db7f7e73fdba9a4524aef8cc7d70ce7f9c700eba690c190bdddc5daffa7" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.772623 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9dp45"] Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 
22:44:43.780382 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9dp45"] Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.797063 4717 scope.go:117] "RemoveContainer" containerID="c5f9e62514b5a66dcb0386a171833c7571b16180a4750d2120ac91407d2022d1" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.828628 4717 scope.go:117] "RemoveContainer" containerID="c1ed798bbe15e30e87f7691d1673109f4b1bf12e90302c08baf25f7c5e167717" Feb 21 22:44:43 crc kubenswrapper[4717]: E0221 22:44:43.829078 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1ed798bbe15e30e87f7691d1673109f4b1bf12e90302c08baf25f7c5e167717\": container with ID starting with c1ed798bbe15e30e87f7691d1673109f4b1bf12e90302c08baf25f7c5e167717 not found: ID does not exist" containerID="c1ed798bbe15e30e87f7691d1673109f4b1bf12e90302c08baf25f7c5e167717" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.829120 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ed798bbe15e30e87f7691d1673109f4b1bf12e90302c08baf25f7c5e167717"} err="failed to get container status \"c1ed798bbe15e30e87f7691d1673109f4b1bf12e90302c08baf25f7c5e167717\": rpc error: code = NotFound desc = could not find container \"c1ed798bbe15e30e87f7691d1673109f4b1bf12e90302c08baf25f7c5e167717\": container with ID starting with c1ed798bbe15e30e87f7691d1673109f4b1bf12e90302c08baf25f7c5e167717 not found: ID does not exist" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.829145 4717 scope.go:117] "RemoveContainer" containerID="52811db7f7e73fdba9a4524aef8cc7d70ce7f9c700eba690c190bdddc5daffa7" Feb 21 22:44:43 crc kubenswrapper[4717]: E0221 22:44:43.829512 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52811db7f7e73fdba9a4524aef8cc7d70ce7f9c700eba690c190bdddc5daffa7\": container with ID 
starting with 52811db7f7e73fdba9a4524aef8cc7d70ce7f9c700eba690c190bdddc5daffa7 not found: ID does not exist" containerID="52811db7f7e73fdba9a4524aef8cc7d70ce7f9c700eba690c190bdddc5daffa7" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.829537 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52811db7f7e73fdba9a4524aef8cc7d70ce7f9c700eba690c190bdddc5daffa7"} err="failed to get container status \"52811db7f7e73fdba9a4524aef8cc7d70ce7f9c700eba690c190bdddc5daffa7\": rpc error: code = NotFound desc = could not find container \"52811db7f7e73fdba9a4524aef8cc7d70ce7f9c700eba690c190bdddc5daffa7\": container with ID starting with 52811db7f7e73fdba9a4524aef8cc7d70ce7f9c700eba690c190bdddc5daffa7 not found: ID does not exist" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.829554 4717 scope.go:117] "RemoveContainer" containerID="c5f9e62514b5a66dcb0386a171833c7571b16180a4750d2120ac91407d2022d1" Feb 21 22:44:43 crc kubenswrapper[4717]: E0221 22:44:43.829847 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5f9e62514b5a66dcb0386a171833c7571b16180a4750d2120ac91407d2022d1\": container with ID starting with c5f9e62514b5a66dcb0386a171833c7571b16180a4750d2120ac91407d2022d1 not found: ID does not exist" containerID="c5f9e62514b5a66dcb0386a171833c7571b16180a4750d2120ac91407d2022d1" Feb 21 22:44:43 crc kubenswrapper[4717]: I0221 22:44:43.829930 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5f9e62514b5a66dcb0386a171833c7571b16180a4750d2120ac91407d2022d1"} err="failed to get container status \"c5f9e62514b5a66dcb0386a171833c7571b16180a4750d2120ac91407d2022d1\": rpc error: code = NotFound desc = could not find container \"c5f9e62514b5a66dcb0386a171833c7571b16180a4750d2120ac91407d2022d1\": container with ID starting with c5f9e62514b5a66dcb0386a171833c7571b16180a4750d2120ac91407d2022d1 not found: 
ID does not exist" Feb 21 22:44:44 crc kubenswrapper[4717]: I0221 22:44:44.000276 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c438b4bf-6e4e-433c-b952-81825ae52873" path="/var/lib/kubelet/pods/c438b4bf-6e4e-433c-b952-81825ae52873/volumes" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.171628 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm"] Feb 21 22:45:00 crc kubenswrapper[4717]: E0221 22:45:00.172556 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c438b4bf-6e4e-433c-b952-81825ae52873" containerName="registry-server" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.172573 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c438b4bf-6e4e-433c-b952-81825ae52873" containerName="registry-server" Feb 21 22:45:00 crc kubenswrapper[4717]: E0221 22:45:00.172593 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c438b4bf-6e4e-433c-b952-81825ae52873" containerName="extract-utilities" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.172601 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c438b4bf-6e4e-433c-b952-81825ae52873" containerName="extract-utilities" Feb 21 22:45:00 crc kubenswrapper[4717]: E0221 22:45:00.172618 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c438b4bf-6e4e-433c-b952-81825ae52873" containerName="extract-content" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.172627 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c438b4bf-6e4e-433c-b952-81825ae52873" containerName="extract-content" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.172883 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c438b4bf-6e4e-433c-b952-81825ae52873" containerName="registry-server" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.173598 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.175717 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.180134 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.188472 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm"] Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.251260 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39c91509-1d8a-4957-80a4-bc467b89364d-secret-volume\") pod \"collect-profiles-29528565-f9tpm\" (UID: \"39c91509-1d8a-4957-80a4-bc467b89364d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.251394 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39c91509-1d8a-4957-80a4-bc467b89364d-config-volume\") pod \"collect-profiles-29528565-f9tpm\" (UID: \"39c91509-1d8a-4957-80a4-bc467b89364d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.251488 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cvsf\" (UniqueName: \"kubernetes.io/projected/39c91509-1d8a-4957-80a4-bc467b89364d-kube-api-access-6cvsf\") pod \"collect-profiles-29528565-f9tpm\" (UID: \"39c91509-1d8a-4957-80a4-bc467b89364d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.353590 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39c91509-1d8a-4957-80a4-bc467b89364d-config-volume\") pod \"collect-profiles-29528565-f9tpm\" (UID: \"39c91509-1d8a-4957-80a4-bc467b89364d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.353732 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cvsf\" (UniqueName: \"kubernetes.io/projected/39c91509-1d8a-4957-80a4-bc467b89364d-kube-api-access-6cvsf\") pod \"collect-profiles-29528565-f9tpm\" (UID: \"39c91509-1d8a-4957-80a4-bc467b89364d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.353810 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39c91509-1d8a-4957-80a4-bc467b89364d-secret-volume\") pod \"collect-profiles-29528565-f9tpm\" (UID: \"39c91509-1d8a-4957-80a4-bc467b89364d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.354648 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39c91509-1d8a-4957-80a4-bc467b89364d-config-volume\") pod \"collect-profiles-29528565-f9tpm\" (UID: \"39c91509-1d8a-4957-80a4-bc467b89364d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.360938 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/39c91509-1d8a-4957-80a4-bc467b89364d-secret-volume\") pod \"collect-profiles-29528565-f9tpm\" (UID: \"39c91509-1d8a-4957-80a4-bc467b89364d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.371989 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cvsf\" (UniqueName: \"kubernetes.io/projected/39c91509-1d8a-4957-80a4-bc467b89364d-kube-api-access-6cvsf\") pod \"collect-profiles-29528565-f9tpm\" (UID: \"39c91509-1d8a-4957-80a4-bc467b89364d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" Feb 21 22:45:00 crc kubenswrapper[4717]: I0221 22:45:00.492048 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" Feb 21 22:45:01 crc kubenswrapper[4717]: I0221 22:45:01.006484 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm"] Feb 21 22:45:01 crc kubenswrapper[4717]: I0221 22:45:01.928532 4717 generic.go:334] "Generic (PLEG): container finished" podID="39c91509-1d8a-4957-80a4-bc467b89364d" containerID="526629eddcb34e30b6fad540805ba5755013357840fbe8d1bf4e537a3776b1de" exitCode=0 Feb 21 22:45:01 crc kubenswrapper[4717]: I0221 22:45:01.928655 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" event={"ID":"39c91509-1d8a-4957-80a4-bc467b89364d","Type":"ContainerDied","Data":"526629eddcb34e30b6fad540805ba5755013357840fbe8d1bf4e537a3776b1de"} Feb 21 22:45:01 crc kubenswrapper[4717]: I0221 22:45:01.928954 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" 
event={"ID":"39c91509-1d8a-4957-80a4-bc467b89364d","Type":"ContainerStarted","Data":"a2fed9b5ebe170eb8b2ddec716c17e3ecfbf6c1e33a7ae5cc9fd2ad8223f27c8"} Feb 21 22:45:03 crc kubenswrapper[4717]: I0221 22:45:03.313066 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" Feb 21 22:45:03 crc kubenswrapper[4717]: I0221 22:45:03.410363 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cvsf\" (UniqueName: \"kubernetes.io/projected/39c91509-1d8a-4957-80a4-bc467b89364d-kube-api-access-6cvsf\") pod \"39c91509-1d8a-4957-80a4-bc467b89364d\" (UID: \"39c91509-1d8a-4957-80a4-bc467b89364d\") " Feb 21 22:45:03 crc kubenswrapper[4717]: I0221 22:45:03.410438 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39c91509-1d8a-4957-80a4-bc467b89364d-config-volume\") pod \"39c91509-1d8a-4957-80a4-bc467b89364d\" (UID: \"39c91509-1d8a-4957-80a4-bc467b89364d\") " Feb 21 22:45:03 crc kubenswrapper[4717]: I0221 22:45:03.410561 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39c91509-1d8a-4957-80a4-bc467b89364d-secret-volume\") pod \"39c91509-1d8a-4957-80a4-bc467b89364d\" (UID: \"39c91509-1d8a-4957-80a4-bc467b89364d\") " Feb 21 22:45:03 crc kubenswrapper[4717]: I0221 22:45:03.411313 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39c91509-1d8a-4957-80a4-bc467b89364d-config-volume" (OuterVolumeSpecName: "config-volume") pod "39c91509-1d8a-4957-80a4-bc467b89364d" (UID: "39c91509-1d8a-4957-80a4-bc467b89364d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 21 22:45:03 crc kubenswrapper[4717]: I0221 22:45:03.417066 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39c91509-1d8a-4957-80a4-bc467b89364d-kube-api-access-6cvsf" (OuterVolumeSpecName: "kube-api-access-6cvsf") pod "39c91509-1d8a-4957-80a4-bc467b89364d" (UID: "39c91509-1d8a-4957-80a4-bc467b89364d"). InnerVolumeSpecName "kube-api-access-6cvsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:45:03 crc kubenswrapper[4717]: I0221 22:45:03.418005 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39c91509-1d8a-4957-80a4-bc467b89364d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "39c91509-1d8a-4957-80a4-bc467b89364d" (UID: "39c91509-1d8a-4957-80a4-bc467b89364d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 21 22:45:03 crc kubenswrapper[4717]: I0221 22:45:03.512236 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cvsf\" (UniqueName: \"kubernetes.io/projected/39c91509-1d8a-4957-80a4-bc467b89364d-kube-api-access-6cvsf\") on node \"crc\" DevicePath \"\"" Feb 21 22:45:03 crc kubenswrapper[4717]: I0221 22:45:03.512262 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39c91509-1d8a-4957-80a4-bc467b89364d-config-volume\") on node \"crc\" DevicePath \"\"" Feb 21 22:45:03 crc kubenswrapper[4717]: I0221 22:45:03.512273 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39c91509-1d8a-4957-80a4-bc467b89364d-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 21 22:45:03 crc kubenswrapper[4717]: I0221 22:45:03.949510 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" 
event={"ID":"39c91509-1d8a-4957-80a4-bc467b89364d","Type":"ContainerDied","Data":"a2fed9b5ebe170eb8b2ddec716c17e3ecfbf6c1e33a7ae5cc9fd2ad8223f27c8"} Feb 21 22:45:03 crc kubenswrapper[4717]: I0221 22:45:03.949551 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29528565-f9tpm" Feb 21 22:45:03 crc kubenswrapper[4717]: I0221 22:45:03.949557 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2fed9b5ebe170eb8b2ddec716c17e3ecfbf6c1e33a7ae5cc9fd2ad8223f27c8" Feb 21 22:45:04 crc kubenswrapper[4717]: I0221 22:45:04.384377 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4"] Feb 21 22:45:04 crc kubenswrapper[4717]: I0221 22:45:04.396252 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29528520-whvw4"] Feb 21 22:45:06 crc kubenswrapper[4717]: I0221 22:45:06.006530 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5143bc63-52d5-4480-a59a-98f8fc364e53" path="/var/lib/kubelet/pods/5143bc63-52d5-4480-a59a-98f8fc364e53/volumes" Feb 21 22:45:09 crc kubenswrapper[4717]: I0221 22:45:09.063075 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:45:09 crc kubenswrapper[4717]: I0221 22:45:09.063689 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:45:11 crc 
kubenswrapper[4717]: I0221 22:45:11.491829 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8g5fq"] Feb 21 22:45:11 crc kubenswrapper[4717]: E0221 22:45:11.497441 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c91509-1d8a-4957-80a4-bc467b89364d" containerName="collect-profiles" Feb 21 22:45:11 crc kubenswrapper[4717]: I0221 22:45:11.497479 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c91509-1d8a-4957-80a4-bc467b89364d" containerName="collect-profiles" Feb 21 22:45:11 crc kubenswrapper[4717]: I0221 22:45:11.497985 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c91509-1d8a-4957-80a4-bc467b89364d" containerName="collect-profiles" Feb 21 22:45:11 crc kubenswrapper[4717]: I0221 22:45:11.500796 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:11 crc kubenswrapper[4717]: I0221 22:45:11.526816 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8g5fq"] Feb 21 22:45:11 crc kubenswrapper[4717]: I0221 22:45:11.580753 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-664jx\" (UniqueName: \"kubernetes.io/projected/56321f0a-82cc-4552-b5df-753a7c914b2f-kube-api-access-664jx\") pod \"community-operators-8g5fq\" (UID: \"56321f0a-82cc-4552-b5df-753a7c914b2f\") " pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:11 crc kubenswrapper[4717]: I0221 22:45:11.580812 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56321f0a-82cc-4552-b5df-753a7c914b2f-utilities\") pod \"community-operators-8g5fq\" (UID: \"56321f0a-82cc-4552-b5df-753a7c914b2f\") " pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:11 crc kubenswrapper[4717]: I0221 
22:45:11.581006 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56321f0a-82cc-4552-b5df-753a7c914b2f-catalog-content\") pod \"community-operators-8g5fq\" (UID: \"56321f0a-82cc-4552-b5df-753a7c914b2f\") " pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:11 crc kubenswrapper[4717]: I0221 22:45:11.683386 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56321f0a-82cc-4552-b5df-753a7c914b2f-catalog-content\") pod \"community-operators-8g5fq\" (UID: \"56321f0a-82cc-4552-b5df-753a7c914b2f\") " pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:11 crc kubenswrapper[4717]: I0221 22:45:11.683532 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-664jx\" (UniqueName: \"kubernetes.io/projected/56321f0a-82cc-4552-b5df-753a7c914b2f-kube-api-access-664jx\") pod \"community-operators-8g5fq\" (UID: \"56321f0a-82cc-4552-b5df-753a7c914b2f\") " pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:11 crc kubenswrapper[4717]: I0221 22:45:11.683561 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56321f0a-82cc-4552-b5df-753a7c914b2f-utilities\") pod \"community-operators-8g5fq\" (UID: \"56321f0a-82cc-4552-b5df-753a7c914b2f\") " pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:11 crc kubenswrapper[4717]: I0221 22:45:11.683940 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56321f0a-82cc-4552-b5df-753a7c914b2f-catalog-content\") pod \"community-operators-8g5fq\" (UID: \"56321f0a-82cc-4552-b5df-753a7c914b2f\") " pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:11 crc kubenswrapper[4717]: I0221 
22:45:11.684069 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56321f0a-82cc-4552-b5df-753a7c914b2f-utilities\") pod \"community-operators-8g5fq\" (UID: \"56321f0a-82cc-4552-b5df-753a7c914b2f\") " pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:11 crc kubenswrapper[4717]: I0221 22:45:11.709606 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-664jx\" (UniqueName: \"kubernetes.io/projected/56321f0a-82cc-4552-b5df-753a7c914b2f-kube-api-access-664jx\") pod \"community-operators-8g5fq\" (UID: \"56321f0a-82cc-4552-b5df-753a7c914b2f\") " pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:11 crc kubenswrapper[4717]: I0221 22:45:11.838516 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:12 crc kubenswrapper[4717]: I0221 22:45:12.394702 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8g5fq"] Feb 21 22:45:13 crc kubenswrapper[4717]: I0221 22:45:13.106267 4717 generic.go:334] "Generic (PLEG): container finished" podID="56321f0a-82cc-4552-b5df-753a7c914b2f" containerID="a44b3d72e8fde36f232bd1e7701325c6e419c582086553f71d9ccb96cb1bdea4" exitCode=0 Feb 21 22:45:13 crc kubenswrapper[4717]: I0221 22:45:13.106377 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8g5fq" event={"ID":"56321f0a-82cc-4552-b5df-753a7c914b2f","Type":"ContainerDied","Data":"a44b3d72e8fde36f232bd1e7701325c6e419c582086553f71d9ccb96cb1bdea4"} Feb 21 22:45:13 crc kubenswrapper[4717]: I0221 22:45:13.106471 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8g5fq" event={"ID":"56321f0a-82cc-4552-b5df-753a7c914b2f","Type":"ContainerStarted","Data":"683ea8754b965137bfc78e5e2f7d2b0262f86585196139c3c35e97f753fbc399"} 
Feb 21 22:45:16 crc kubenswrapper[4717]: I0221 22:45:16.146857 4717 generic.go:334] "Generic (PLEG): container finished" podID="56321f0a-82cc-4552-b5df-753a7c914b2f" containerID="6a1537fdb8ce6a3bbfa1c796d31fc49db2f1700a5f127dd2ba89cd7d1e56be1b" exitCode=0 Feb 21 22:45:16 crc kubenswrapper[4717]: I0221 22:45:16.147007 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8g5fq" event={"ID":"56321f0a-82cc-4552-b5df-753a7c914b2f","Type":"ContainerDied","Data":"6a1537fdb8ce6a3bbfa1c796d31fc49db2f1700a5f127dd2ba89cd7d1e56be1b"} Feb 21 22:45:17 crc kubenswrapper[4717]: I0221 22:45:17.159346 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8g5fq" event={"ID":"56321f0a-82cc-4552-b5df-753a7c914b2f","Type":"ContainerStarted","Data":"5631c33a0385ed819e03c6b232c8be3aad753ce9ae7b79cc5db246b66f0257d5"} Feb 21 22:45:17 crc kubenswrapper[4717]: I0221 22:45:17.189242 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8g5fq" podStartSLOduration=2.626563219 podStartE2EDuration="6.189184355s" podCreationTimestamp="2026-02-21 22:45:11 +0000 UTC" firstStartedPulling="2026-02-21 22:45:13.109820742 +0000 UTC m=+3527.891354404" lastFinishedPulling="2026-02-21 22:45:16.672441908 +0000 UTC m=+3531.453975540" observedRunningTime="2026-02-21 22:45:17.185726363 +0000 UTC m=+3531.967259995" watchObservedRunningTime="2026-02-21 22:45:17.189184355 +0000 UTC m=+3531.970718017" Feb 21 22:45:21 crc kubenswrapper[4717]: I0221 22:45:21.838728 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:21 crc kubenswrapper[4717]: I0221 22:45:21.839290 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:21 crc kubenswrapper[4717]: I0221 22:45:21.909284 4717 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:22 crc kubenswrapper[4717]: I0221 22:45:22.285059 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:22 crc kubenswrapper[4717]: I0221 22:45:22.340327 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8g5fq"] Feb 21 22:45:24 crc kubenswrapper[4717]: I0221 22:45:24.249150 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8g5fq" podUID="56321f0a-82cc-4552-b5df-753a7c914b2f" containerName="registry-server" containerID="cri-o://5631c33a0385ed819e03c6b232c8be3aad753ce9ae7b79cc5db246b66f0257d5" gracePeriod=2 Feb 21 22:45:24 crc kubenswrapper[4717]: I0221 22:45:24.761388 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:24 crc kubenswrapper[4717]: I0221 22:45:24.897538 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-664jx\" (UniqueName: \"kubernetes.io/projected/56321f0a-82cc-4552-b5df-753a7c914b2f-kube-api-access-664jx\") pod \"56321f0a-82cc-4552-b5df-753a7c914b2f\" (UID: \"56321f0a-82cc-4552-b5df-753a7c914b2f\") " Feb 21 22:45:24 crc kubenswrapper[4717]: I0221 22:45:24.897721 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56321f0a-82cc-4552-b5df-753a7c914b2f-utilities\") pod \"56321f0a-82cc-4552-b5df-753a7c914b2f\" (UID: \"56321f0a-82cc-4552-b5df-753a7c914b2f\") " Feb 21 22:45:24 crc kubenswrapper[4717]: I0221 22:45:24.897784 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/56321f0a-82cc-4552-b5df-753a7c914b2f-catalog-content\") pod \"56321f0a-82cc-4552-b5df-753a7c914b2f\" (UID: \"56321f0a-82cc-4552-b5df-753a7c914b2f\") " Feb 21 22:45:24 crc kubenswrapper[4717]: I0221 22:45:24.899117 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56321f0a-82cc-4552-b5df-753a7c914b2f-utilities" (OuterVolumeSpecName: "utilities") pod "56321f0a-82cc-4552-b5df-753a7c914b2f" (UID: "56321f0a-82cc-4552-b5df-753a7c914b2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:45:24 crc kubenswrapper[4717]: I0221 22:45:24.905835 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56321f0a-82cc-4552-b5df-753a7c914b2f-kube-api-access-664jx" (OuterVolumeSpecName: "kube-api-access-664jx") pod "56321f0a-82cc-4552-b5df-753a7c914b2f" (UID: "56321f0a-82cc-4552-b5df-753a7c914b2f"). InnerVolumeSpecName "kube-api-access-664jx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:45:24 crc kubenswrapper[4717]: I0221 22:45:24.975720 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56321f0a-82cc-4552-b5df-753a7c914b2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56321f0a-82cc-4552-b5df-753a7c914b2f" (UID: "56321f0a-82cc-4552-b5df-753a7c914b2f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.000736 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-664jx\" (UniqueName: \"kubernetes.io/projected/56321f0a-82cc-4552-b5df-753a7c914b2f-kube-api-access-664jx\") on node \"crc\" DevicePath \"\"" Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.000772 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56321f0a-82cc-4552-b5df-753a7c914b2f-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.000782 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56321f0a-82cc-4552-b5df-753a7c914b2f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.270996 4717 generic.go:334] "Generic (PLEG): container finished" podID="56321f0a-82cc-4552-b5df-753a7c914b2f" containerID="5631c33a0385ed819e03c6b232c8be3aad753ce9ae7b79cc5db246b66f0257d5" exitCode=0 Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.271050 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8g5fq" event={"ID":"56321f0a-82cc-4552-b5df-753a7c914b2f","Type":"ContainerDied","Data":"5631c33a0385ed819e03c6b232c8be3aad753ce9ae7b79cc5db246b66f0257d5"} Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.271087 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8g5fq" event={"ID":"56321f0a-82cc-4552-b5df-753a7c914b2f","Type":"ContainerDied","Data":"683ea8754b965137bfc78e5e2f7d2b0262f86585196139c3c35e97f753fbc399"} Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.271111 4717 scope.go:117] "RemoveContainer" containerID="5631c33a0385ed819e03c6b232c8be3aad753ce9ae7b79cc5db246b66f0257d5" Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 
22:45:25.271091 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8g5fq" Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.302344 4717 scope.go:117] "RemoveContainer" containerID="6a1537fdb8ce6a3bbfa1c796d31fc49db2f1700a5f127dd2ba89cd7d1e56be1b" Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.331216 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8g5fq"] Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.340363 4717 scope.go:117] "RemoveContainer" containerID="a44b3d72e8fde36f232bd1e7701325c6e419c582086553f71d9ccb96cb1bdea4" Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.340511 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8g5fq"] Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.387321 4717 scope.go:117] "RemoveContainer" containerID="5631c33a0385ed819e03c6b232c8be3aad753ce9ae7b79cc5db246b66f0257d5" Feb 21 22:45:25 crc kubenswrapper[4717]: E0221 22:45:25.389531 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5631c33a0385ed819e03c6b232c8be3aad753ce9ae7b79cc5db246b66f0257d5\": container with ID starting with 5631c33a0385ed819e03c6b232c8be3aad753ce9ae7b79cc5db246b66f0257d5 not found: ID does not exist" containerID="5631c33a0385ed819e03c6b232c8be3aad753ce9ae7b79cc5db246b66f0257d5" Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.389978 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5631c33a0385ed819e03c6b232c8be3aad753ce9ae7b79cc5db246b66f0257d5"} err="failed to get container status \"5631c33a0385ed819e03c6b232c8be3aad753ce9ae7b79cc5db246b66f0257d5\": rpc error: code = NotFound desc = could not find container \"5631c33a0385ed819e03c6b232c8be3aad753ce9ae7b79cc5db246b66f0257d5\": container with ID starting with 
5631c33a0385ed819e03c6b232c8be3aad753ce9ae7b79cc5db246b66f0257d5 not found: ID does not exist" Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.390155 4717 scope.go:117] "RemoveContainer" containerID="6a1537fdb8ce6a3bbfa1c796d31fc49db2f1700a5f127dd2ba89cd7d1e56be1b" Feb 21 22:45:25 crc kubenswrapper[4717]: E0221 22:45:25.390826 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a1537fdb8ce6a3bbfa1c796d31fc49db2f1700a5f127dd2ba89cd7d1e56be1b\": container with ID starting with 6a1537fdb8ce6a3bbfa1c796d31fc49db2f1700a5f127dd2ba89cd7d1e56be1b not found: ID does not exist" containerID="6a1537fdb8ce6a3bbfa1c796d31fc49db2f1700a5f127dd2ba89cd7d1e56be1b" Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.390868 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1537fdb8ce6a3bbfa1c796d31fc49db2f1700a5f127dd2ba89cd7d1e56be1b"} err="failed to get container status \"6a1537fdb8ce6a3bbfa1c796d31fc49db2f1700a5f127dd2ba89cd7d1e56be1b\": rpc error: code = NotFound desc = could not find container \"6a1537fdb8ce6a3bbfa1c796d31fc49db2f1700a5f127dd2ba89cd7d1e56be1b\": container with ID starting with 6a1537fdb8ce6a3bbfa1c796d31fc49db2f1700a5f127dd2ba89cd7d1e56be1b not found: ID does not exist" Feb 21 22:45:25 crc kubenswrapper[4717]: I0221 22:45:25.390889 4717 scope.go:117] "RemoveContainer" containerID="a44b3d72e8fde36f232bd1e7701325c6e419c582086553f71d9ccb96cb1bdea4" Feb 21 22:45:25 crc kubenswrapper[4717]: E0221 22:45:25.391295 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a44b3d72e8fde36f232bd1e7701325c6e419c582086553f71d9ccb96cb1bdea4\": container with ID starting with a44b3d72e8fde36f232bd1e7701325c6e419c582086553f71d9ccb96cb1bdea4 not found: ID does not exist" containerID="a44b3d72e8fde36f232bd1e7701325c6e419c582086553f71d9ccb96cb1bdea4" Feb 21 22:45:25 crc 
kubenswrapper[4717]: I0221 22:45:25.391418 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a44b3d72e8fde36f232bd1e7701325c6e419c582086553f71d9ccb96cb1bdea4"} err="failed to get container status \"a44b3d72e8fde36f232bd1e7701325c6e419c582086553f71d9ccb96cb1bdea4\": rpc error: code = NotFound desc = could not find container \"a44b3d72e8fde36f232bd1e7701325c6e419c582086553f71d9ccb96cb1bdea4\": container with ID starting with a44b3d72e8fde36f232bd1e7701325c6e419c582086553f71d9ccb96cb1bdea4 not found: ID does not exist" Feb 21 22:45:26 crc kubenswrapper[4717]: I0221 22:45:26.017747 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56321f0a-82cc-4552-b5df-753a7c914b2f" path="/var/lib/kubelet/pods/56321f0a-82cc-4552-b5df-753a7c914b2f/volumes" Feb 21 22:45:37 crc kubenswrapper[4717]: I0221 22:45:37.118149 4717 scope.go:117] "RemoveContainer" containerID="5ee67d3501af8c06a273cf88a739df40b51172981e58d4472d7d3854414353cd" Feb 21 22:45:39 crc kubenswrapper[4717]: I0221 22:45:39.066966 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:45:39 crc kubenswrapper[4717]: I0221 22:45:39.067449 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:45:39 crc kubenswrapper[4717]: I0221 22:45:39.067497 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 22:45:39 crc kubenswrapper[4717]: 
I0221 22:45:39.068182 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 22:45:39 crc kubenswrapper[4717]: I0221 22:45:39.068224 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac" gracePeriod=600 Feb 21 22:45:39 crc kubenswrapper[4717]: E0221 22:45:39.196690 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:45:39 crc kubenswrapper[4717]: I0221 22:45:39.424803 4717 generic.go:334] "Generic (PLEG): container finished" podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac" exitCode=0 Feb 21 22:45:39 crc kubenswrapper[4717]: I0221 22:45:39.424850 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"} Feb 21 22:45:39 crc kubenswrapper[4717]: I0221 22:45:39.424905 4717 scope.go:117] "RemoveContainer" 
containerID="ccc78a7a4117f5c2e21732ec2593c73a83c28b2a3108d6d9f273300c9f08f15e" Feb 21 22:45:39 crc kubenswrapper[4717]: I0221 22:45:39.425779 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac" Feb 21 22:45:39 crc kubenswrapper[4717]: E0221 22:45:39.426149 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:45:51 crc kubenswrapper[4717]: I0221 22:45:51.979846 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac" Feb 21 22:45:51 crc kubenswrapper[4717]: E0221 22:45:51.981607 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:46:03 crc kubenswrapper[4717]: I0221 22:46:03.977387 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac" Feb 21 22:46:03 crc kubenswrapper[4717]: E0221 22:46:03.980750 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:46:10 crc kubenswrapper[4717]: I0221 22:46:10.953187 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nn47k"] Feb 21 22:46:10 crc kubenswrapper[4717]: E0221 22:46:10.955552 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56321f0a-82cc-4552-b5df-753a7c914b2f" containerName="extract-utilities" Feb 21 22:46:10 crc kubenswrapper[4717]: I0221 22:46:10.955578 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="56321f0a-82cc-4552-b5df-753a7c914b2f" containerName="extract-utilities" Feb 21 22:46:10 crc kubenswrapper[4717]: E0221 22:46:10.955640 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56321f0a-82cc-4552-b5df-753a7c914b2f" containerName="registry-server" Feb 21 22:46:10 crc kubenswrapper[4717]: I0221 22:46:10.955654 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="56321f0a-82cc-4552-b5df-753a7c914b2f" containerName="registry-server" Feb 21 22:46:10 crc kubenswrapper[4717]: E0221 22:46:10.955684 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56321f0a-82cc-4552-b5df-753a7c914b2f" containerName="extract-content" Feb 21 22:46:10 crc kubenswrapper[4717]: I0221 22:46:10.955697 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="56321f0a-82cc-4552-b5df-753a7c914b2f" containerName="extract-content" Feb 21 22:46:10 crc kubenswrapper[4717]: I0221 22:46:10.956633 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="56321f0a-82cc-4552-b5df-753a7c914b2f" containerName="registry-server" Feb 21 22:46:10 crc kubenswrapper[4717]: I0221 22:46:10.959110 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:10 crc kubenswrapper[4717]: I0221 22:46:10.976995 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nn47k"] Feb 21 22:46:11 crc kubenswrapper[4717]: I0221 22:46:11.037712 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15716053-ad9e-433e-9e65-e2e2cfa61cad-catalog-content\") pod \"certified-operators-nn47k\" (UID: \"15716053-ad9e-433e-9e65-e2e2cfa61cad\") " pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:11 crc kubenswrapper[4717]: I0221 22:46:11.038346 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpjrm\" (UniqueName: \"kubernetes.io/projected/15716053-ad9e-433e-9e65-e2e2cfa61cad-kube-api-access-tpjrm\") pod \"certified-operators-nn47k\" (UID: \"15716053-ad9e-433e-9e65-e2e2cfa61cad\") " pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:11 crc kubenswrapper[4717]: I0221 22:46:11.038426 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15716053-ad9e-433e-9e65-e2e2cfa61cad-utilities\") pod \"certified-operators-nn47k\" (UID: \"15716053-ad9e-433e-9e65-e2e2cfa61cad\") " pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:11 crc kubenswrapper[4717]: I0221 22:46:11.140959 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpjrm\" (UniqueName: \"kubernetes.io/projected/15716053-ad9e-433e-9e65-e2e2cfa61cad-kube-api-access-tpjrm\") pod \"certified-operators-nn47k\" (UID: \"15716053-ad9e-433e-9e65-e2e2cfa61cad\") " pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:11 crc kubenswrapper[4717]: I0221 22:46:11.141121 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15716053-ad9e-433e-9e65-e2e2cfa61cad-utilities\") pod \"certified-operators-nn47k\" (UID: \"15716053-ad9e-433e-9e65-e2e2cfa61cad\") " pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:11 crc kubenswrapper[4717]: I0221 22:46:11.141243 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15716053-ad9e-433e-9e65-e2e2cfa61cad-catalog-content\") pod \"certified-operators-nn47k\" (UID: \"15716053-ad9e-433e-9e65-e2e2cfa61cad\") " pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:11 crc kubenswrapper[4717]: I0221 22:46:11.141851 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15716053-ad9e-433e-9e65-e2e2cfa61cad-utilities\") pod \"certified-operators-nn47k\" (UID: \"15716053-ad9e-433e-9e65-e2e2cfa61cad\") " pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:11 crc kubenswrapper[4717]: I0221 22:46:11.142085 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15716053-ad9e-433e-9e65-e2e2cfa61cad-catalog-content\") pod \"certified-operators-nn47k\" (UID: \"15716053-ad9e-433e-9e65-e2e2cfa61cad\") " pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:11 crc kubenswrapper[4717]: I0221 22:46:11.164615 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpjrm\" (UniqueName: \"kubernetes.io/projected/15716053-ad9e-433e-9e65-e2e2cfa61cad-kube-api-access-tpjrm\") pod \"certified-operators-nn47k\" (UID: \"15716053-ad9e-433e-9e65-e2e2cfa61cad\") " pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:11 crc kubenswrapper[4717]: I0221 22:46:11.302646 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:11 crc kubenswrapper[4717]: I0221 22:46:11.815035 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nn47k"] Feb 21 22:46:12 crc kubenswrapper[4717]: I0221 22:46:12.795616 4717 generic.go:334] "Generic (PLEG): container finished" podID="15716053-ad9e-433e-9e65-e2e2cfa61cad" containerID="3ffc19e0999a3c0642f8f70644b063217c893b855d4f5e535e4b2a14f56c120d" exitCode=0 Feb 21 22:46:12 crc kubenswrapper[4717]: I0221 22:46:12.795720 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn47k" event={"ID":"15716053-ad9e-433e-9e65-e2e2cfa61cad","Type":"ContainerDied","Data":"3ffc19e0999a3c0642f8f70644b063217c893b855d4f5e535e4b2a14f56c120d"} Feb 21 22:46:12 crc kubenswrapper[4717]: I0221 22:46:12.796161 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn47k" event={"ID":"15716053-ad9e-433e-9e65-e2e2cfa61cad","Type":"ContainerStarted","Data":"f08c7211b09e5e1a60de2a8c052fd20467aa066af2a8b58977f685daa806862f"} Feb 21 22:46:12 crc kubenswrapper[4717]: I0221 22:46:12.810670 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 21 22:46:13 crc kubenswrapper[4717]: I0221 22:46:13.807804 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn47k" event={"ID":"15716053-ad9e-433e-9e65-e2e2cfa61cad","Type":"ContainerStarted","Data":"83eac325e5ae214e175fd8b7514027e173cc1aae5009a82c276965cfb54c405d"} Feb 21 22:46:14 crc kubenswrapper[4717]: I0221 22:46:14.823642 4717 generic.go:334] "Generic (PLEG): container finished" podID="15716053-ad9e-433e-9e65-e2e2cfa61cad" containerID="83eac325e5ae214e175fd8b7514027e173cc1aae5009a82c276965cfb54c405d" exitCode=0 Feb 21 22:46:14 crc kubenswrapper[4717]: I0221 22:46:14.823724 4717 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-nn47k" event={"ID":"15716053-ad9e-433e-9e65-e2e2cfa61cad","Type":"ContainerDied","Data":"83eac325e5ae214e175fd8b7514027e173cc1aae5009a82c276965cfb54c405d"} Feb 21 22:46:14 crc kubenswrapper[4717]: I0221 22:46:14.977703 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac" Feb 21 22:46:14 crc kubenswrapper[4717]: E0221 22:46:14.977962 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:46:15 crc kubenswrapper[4717]: I0221 22:46:15.838270 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn47k" event={"ID":"15716053-ad9e-433e-9e65-e2e2cfa61cad","Type":"ContainerStarted","Data":"1267ce5b80e4876234517ea322def8c26baa820e4ad7712399da258cf8102948"} Feb 21 22:46:15 crc kubenswrapper[4717]: I0221 22:46:15.876906 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nn47k" podStartSLOduration=3.3650785389999998 podStartE2EDuration="5.876881546s" podCreationTimestamp="2026-02-21 22:46:10 +0000 UTC" firstStartedPulling="2026-02-21 22:46:12.810231941 +0000 UTC m=+3587.591765593" lastFinishedPulling="2026-02-21 22:46:15.322034978 +0000 UTC m=+3590.103568600" observedRunningTime="2026-02-21 22:46:15.865528895 +0000 UTC m=+3590.647062557" watchObservedRunningTime="2026-02-21 22:46:15.876881546 +0000 UTC m=+3590.658415178" Feb 21 22:46:16 crc kubenswrapper[4717]: I0221 22:46:16.855081 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="d1878cb1-51e8-4916-9bff-b056af0bb210" containerID="7390d0790f8fdf2ed58e43dac81e694d522ffb29ec34dc2268ba944818187a6d" exitCode=0 Feb 21 22:46:16 crc kubenswrapper[4717]: I0221 22:46:16.855197 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f2s4m/must-gather-956j5" event={"ID":"d1878cb1-51e8-4916-9bff-b056af0bb210","Type":"ContainerDied","Data":"7390d0790f8fdf2ed58e43dac81e694d522ffb29ec34dc2268ba944818187a6d"} Feb 21 22:46:16 crc kubenswrapper[4717]: I0221 22:46:16.856415 4717 scope.go:117] "RemoveContainer" containerID="7390d0790f8fdf2ed58e43dac81e694d522ffb29ec34dc2268ba944818187a6d" Feb 21 22:46:17 crc kubenswrapper[4717]: I0221 22:46:17.564643 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f2s4m_must-gather-956j5_d1878cb1-51e8-4916-9bff-b056af0bb210/gather/0.log" Feb 21 22:46:21 crc kubenswrapper[4717]: I0221 22:46:21.303369 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:21 crc kubenswrapper[4717]: I0221 22:46:21.304669 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:21 crc kubenswrapper[4717]: I0221 22:46:21.365437 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:22 crc kubenswrapper[4717]: I0221 22:46:22.014080 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:22 crc kubenswrapper[4717]: I0221 22:46:22.075351 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nn47k"] Feb 21 22:46:23 crc kubenswrapper[4717]: I0221 22:46:23.938411 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nn47k" 
podUID="15716053-ad9e-433e-9e65-e2e2cfa61cad" containerName="registry-server" containerID="cri-o://1267ce5b80e4876234517ea322def8c26baa820e4ad7712399da258cf8102948" gracePeriod=2 Feb 21 22:46:24 crc kubenswrapper[4717]: I0221 22:46:24.446934 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:24 crc kubenswrapper[4717]: I0221 22:46:24.485285 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15716053-ad9e-433e-9e65-e2e2cfa61cad-catalog-content\") pod \"15716053-ad9e-433e-9e65-e2e2cfa61cad\" (UID: \"15716053-ad9e-433e-9e65-e2e2cfa61cad\") " Feb 21 22:46:24 crc kubenswrapper[4717]: I0221 22:46:24.485461 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15716053-ad9e-433e-9e65-e2e2cfa61cad-utilities\") pod \"15716053-ad9e-433e-9e65-e2e2cfa61cad\" (UID: \"15716053-ad9e-433e-9e65-e2e2cfa61cad\") " Feb 21 22:46:24 crc kubenswrapper[4717]: I0221 22:46:24.485676 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpjrm\" (UniqueName: \"kubernetes.io/projected/15716053-ad9e-433e-9e65-e2e2cfa61cad-kube-api-access-tpjrm\") pod \"15716053-ad9e-433e-9e65-e2e2cfa61cad\" (UID: \"15716053-ad9e-433e-9e65-e2e2cfa61cad\") " Feb 21 22:46:24 crc kubenswrapper[4717]: I0221 22:46:24.487110 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15716053-ad9e-433e-9e65-e2e2cfa61cad-utilities" (OuterVolumeSpecName: "utilities") pod "15716053-ad9e-433e-9e65-e2e2cfa61cad" (UID: "15716053-ad9e-433e-9e65-e2e2cfa61cad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:46:24 crc kubenswrapper[4717]: I0221 22:46:24.491700 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15716053-ad9e-433e-9e65-e2e2cfa61cad-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:46:24 crc kubenswrapper[4717]: I0221 22:46:24.497599 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15716053-ad9e-433e-9e65-e2e2cfa61cad-kube-api-access-tpjrm" (OuterVolumeSpecName: "kube-api-access-tpjrm") pod "15716053-ad9e-433e-9e65-e2e2cfa61cad" (UID: "15716053-ad9e-433e-9e65-e2e2cfa61cad"). InnerVolumeSpecName "kube-api-access-tpjrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:46:24 crc kubenswrapper[4717]: I0221 22:46:24.552183 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15716053-ad9e-433e-9e65-e2e2cfa61cad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15716053-ad9e-433e-9e65-e2e2cfa61cad" (UID: "15716053-ad9e-433e-9e65-e2e2cfa61cad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:46:24 crc kubenswrapper[4717]: I0221 22:46:24.593165 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpjrm\" (UniqueName: \"kubernetes.io/projected/15716053-ad9e-433e-9e65-e2e2cfa61cad-kube-api-access-tpjrm\") on node \"crc\" DevicePath \"\"" Feb 21 22:46:24 crc kubenswrapper[4717]: I0221 22:46:24.593223 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15716053-ad9e-433e-9e65-e2e2cfa61cad-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:46:24 crc kubenswrapper[4717]: I0221 22:46:24.956128 4717 generic.go:334] "Generic (PLEG): container finished" podID="15716053-ad9e-433e-9e65-e2e2cfa61cad" containerID="1267ce5b80e4876234517ea322def8c26baa820e4ad7712399da258cf8102948" exitCode=0 Feb 21 22:46:24 crc kubenswrapper[4717]: I0221 22:46:24.956208 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nn47k" Feb 21 22:46:24 crc kubenswrapper[4717]: I0221 22:46:24.956221 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn47k" event={"ID":"15716053-ad9e-433e-9e65-e2e2cfa61cad","Type":"ContainerDied","Data":"1267ce5b80e4876234517ea322def8c26baa820e4ad7712399da258cf8102948"} Feb 21 22:46:24 crc kubenswrapper[4717]: I0221 22:46:24.956302 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nn47k" event={"ID":"15716053-ad9e-433e-9e65-e2e2cfa61cad","Type":"ContainerDied","Data":"f08c7211b09e5e1a60de2a8c052fd20467aa066af2a8b58977f685daa806862f"} Feb 21 22:46:24 crc kubenswrapper[4717]: I0221 22:46:24.956335 4717 scope.go:117] "RemoveContainer" containerID="1267ce5b80e4876234517ea322def8c26baa820e4ad7712399da258cf8102948" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.003295 4717 scope.go:117] "RemoveContainer" 
containerID="83eac325e5ae214e175fd8b7514027e173cc1aae5009a82c276965cfb54c405d" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.021745 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nn47k"] Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.041149 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nn47k"] Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.046534 4717 scope.go:117] "RemoveContainer" containerID="3ffc19e0999a3c0642f8f70644b063217c893b855d4f5e535e4b2a14f56c120d" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.091354 4717 scope.go:117] "RemoveContainer" containerID="1267ce5b80e4876234517ea322def8c26baa820e4ad7712399da258cf8102948" Feb 21 22:46:25 crc kubenswrapper[4717]: E0221 22:46:25.092055 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1267ce5b80e4876234517ea322def8c26baa820e4ad7712399da258cf8102948\": container with ID starting with 1267ce5b80e4876234517ea322def8c26baa820e4ad7712399da258cf8102948 not found: ID does not exist" containerID="1267ce5b80e4876234517ea322def8c26baa820e4ad7712399da258cf8102948" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.092113 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1267ce5b80e4876234517ea322def8c26baa820e4ad7712399da258cf8102948"} err="failed to get container status \"1267ce5b80e4876234517ea322def8c26baa820e4ad7712399da258cf8102948\": rpc error: code = NotFound desc = could not find container \"1267ce5b80e4876234517ea322def8c26baa820e4ad7712399da258cf8102948\": container with ID starting with 1267ce5b80e4876234517ea322def8c26baa820e4ad7712399da258cf8102948 not found: ID does not exist" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.092153 4717 scope.go:117] "RemoveContainer" 
containerID="83eac325e5ae214e175fd8b7514027e173cc1aae5009a82c276965cfb54c405d" Feb 21 22:46:25 crc kubenswrapper[4717]: E0221 22:46:25.092750 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83eac325e5ae214e175fd8b7514027e173cc1aae5009a82c276965cfb54c405d\": container with ID starting with 83eac325e5ae214e175fd8b7514027e173cc1aae5009a82c276965cfb54c405d not found: ID does not exist" containerID="83eac325e5ae214e175fd8b7514027e173cc1aae5009a82c276965cfb54c405d" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.092789 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83eac325e5ae214e175fd8b7514027e173cc1aae5009a82c276965cfb54c405d"} err="failed to get container status \"83eac325e5ae214e175fd8b7514027e173cc1aae5009a82c276965cfb54c405d\": rpc error: code = NotFound desc = could not find container \"83eac325e5ae214e175fd8b7514027e173cc1aae5009a82c276965cfb54c405d\": container with ID starting with 83eac325e5ae214e175fd8b7514027e173cc1aae5009a82c276965cfb54c405d not found: ID does not exist" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.092809 4717 scope.go:117] "RemoveContainer" containerID="3ffc19e0999a3c0642f8f70644b063217c893b855d4f5e535e4b2a14f56c120d" Feb 21 22:46:25 crc kubenswrapper[4717]: E0221 22:46:25.093345 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ffc19e0999a3c0642f8f70644b063217c893b855d4f5e535e4b2a14f56c120d\": container with ID starting with 3ffc19e0999a3c0642f8f70644b063217c893b855d4f5e535e4b2a14f56c120d not found: ID does not exist" containerID="3ffc19e0999a3c0642f8f70644b063217c893b855d4f5e535e4b2a14f56c120d" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.093395 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3ffc19e0999a3c0642f8f70644b063217c893b855d4f5e535e4b2a14f56c120d"} err="failed to get container status \"3ffc19e0999a3c0642f8f70644b063217c893b855d4f5e535e4b2a14f56c120d\": rpc error: code = NotFound desc = could not find container \"3ffc19e0999a3c0642f8f70644b063217c893b855d4f5e535e4b2a14f56c120d\": container with ID starting with 3ffc19e0999a3c0642f8f70644b063217c893b855d4f5e535e4b2a14f56c120d not found: ID does not exist" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.114186 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f2s4m/must-gather-956j5"] Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.114601 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-f2s4m/must-gather-956j5" podUID="d1878cb1-51e8-4916-9bff-b056af0bb210" containerName="copy" containerID="cri-o://a5269b69fc2dadf2644def89e7db82076eadcd226172bb4d4f925e79b440a83f" gracePeriod=2 Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.125941 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f2s4m/must-gather-956j5"] Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.550232 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f2s4m_must-gather-956j5_d1878cb1-51e8-4916-9bff-b056af0bb210/copy/0.log" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.551529 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f2s4m/must-gather-956j5" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.637143 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjblh\" (UniqueName: \"kubernetes.io/projected/d1878cb1-51e8-4916-9bff-b056af0bb210-kube-api-access-gjblh\") pod \"d1878cb1-51e8-4916-9bff-b056af0bb210\" (UID: \"d1878cb1-51e8-4916-9bff-b056af0bb210\") " Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.637263 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d1878cb1-51e8-4916-9bff-b056af0bb210-must-gather-output\") pod \"d1878cb1-51e8-4916-9bff-b056af0bb210\" (UID: \"d1878cb1-51e8-4916-9bff-b056af0bb210\") " Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.643101 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1878cb1-51e8-4916-9bff-b056af0bb210-kube-api-access-gjblh" (OuterVolumeSpecName: "kube-api-access-gjblh") pod "d1878cb1-51e8-4916-9bff-b056af0bb210" (UID: "d1878cb1-51e8-4916-9bff-b056af0bb210"). InnerVolumeSpecName "kube-api-access-gjblh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.740125 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjblh\" (UniqueName: \"kubernetes.io/projected/d1878cb1-51e8-4916-9bff-b056af0bb210-kube-api-access-gjblh\") on node \"crc\" DevicePath \"\"" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.796815 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1878cb1-51e8-4916-9bff-b056af0bb210-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d1878cb1-51e8-4916-9bff-b056af0bb210" (UID: "d1878cb1-51e8-4916-9bff-b056af0bb210"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.842845 4717 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d1878cb1-51e8-4916-9bff-b056af0bb210-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.967027 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f2s4m_must-gather-956j5_d1878cb1-51e8-4916-9bff-b056af0bb210/copy/0.log" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.967513 4717 generic.go:334] "Generic (PLEG): container finished" podID="d1878cb1-51e8-4916-9bff-b056af0bb210" containerID="a5269b69fc2dadf2644def89e7db82076eadcd226172bb4d4f925e79b440a83f" exitCode=143 Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.967600 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f2s4m/must-gather-956j5" Feb 21 22:46:25 crc kubenswrapper[4717]: I0221 22:46:25.967602 4717 scope.go:117] "RemoveContainer" containerID="a5269b69fc2dadf2644def89e7db82076eadcd226172bb4d4f925e79b440a83f" Feb 21 22:46:26 crc kubenswrapper[4717]: I0221 22:46:26.013547 4717 scope.go:117] "RemoveContainer" containerID="7390d0790f8fdf2ed58e43dac81e694d522ffb29ec34dc2268ba944818187a6d" Feb 21 22:46:26 crc kubenswrapper[4717]: I0221 22:46:26.013586 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15716053-ad9e-433e-9e65-e2e2cfa61cad" path="/var/lib/kubelet/pods/15716053-ad9e-433e-9e65-e2e2cfa61cad/volumes" Feb 21 22:46:26 crc kubenswrapper[4717]: I0221 22:46:26.015330 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1878cb1-51e8-4916-9bff-b056af0bb210" path="/var/lib/kubelet/pods/d1878cb1-51e8-4916-9bff-b056af0bb210/volumes" Feb 21 22:46:26 crc kubenswrapper[4717]: I0221 22:46:26.089368 4717 scope.go:117] "RemoveContainer" 
containerID="a5269b69fc2dadf2644def89e7db82076eadcd226172bb4d4f925e79b440a83f" Feb 21 22:46:26 crc kubenswrapper[4717]: E0221 22:46:26.089831 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5269b69fc2dadf2644def89e7db82076eadcd226172bb4d4f925e79b440a83f\": container with ID starting with a5269b69fc2dadf2644def89e7db82076eadcd226172bb4d4f925e79b440a83f not found: ID does not exist" containerID="a5269b69fc2dadf2644def89e7db82076eadcd226172bb4d4f925e79b440a83f" Feb 21 22:46:26 crc kubenswrapper[4717]: I0221 22:46:26.089962 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5269b69fc2dadf2644def89e7db82076eadcd226172bb4d4f925e79b440a83f"} err="failed to get container status \"a5269b69fc2dadf2644def89e7db82076eadcd226172bb4d4f925e79b440a83f\": rpc error: code = NotFound desc = could not find container \"a5269b69fc2dadf2644def89e7db82076eadcd226172bb4d4f925e79b440a83f\": container with ID starting with a5269b69fc2dadf2644def89e7db82076eadcd226172bb4d4f925e79b440a83f not found: ID does not exist" Feb 21 22:46:26 crc kubenswrapper[4717]: I0221 22:46:26.090050 4717 scope.go:117] "RemoveContainer" containerID="7390d0790f8fdf2ed58e43dac81e694d522ffb29ec34dc2268ba944818187a6d" Feb 21 22:46:26 crc kubenswrapper[4717]: E0221 22:46:26.091290 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7390d0790f8fdf2ed58e43dac81e694d522ffb29ec34dc2268ba944818187a6d\": container with ID starting with 7390d0790f8fdf2ed58e43dac81e694d522ffb29ec34dc2268ba944818187a6d not found: ID does not exist" containerID="7390d0790f8fdf2ed58e43dac81e694d522ffb29ec34dc2268ba944818187a6d" Feb 21 22:46:26 crc kubenswrapper[4717]: I0221 22:46:26.091362 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7390d0790f8fdf2ed58e43dac81e694d522ffb29ec34dc2268ba944818187a6d"} err="failed to get container status \"7390d0790f8fdf2ed58e43dac81e694d522ffb29ec34dc2268ba944818187a6d\": rpc error: code = NotFound desc = could not find container \"7390d0790f8fdf2ed58e43dac81e694d522ffb29ec34dc2268ba944818187a6d\": container with ID starting with 7390d0790f8fdf2ed58e43dac81e694d522ffb29ec34dc2268ba944818187a6d not found: ID does not exist" Feb 21 22:46:29 crc kubenswrapper[4717]: I0221 22:46:29.980069 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac" Feb 21 22:46:29 crc kubenswrapper[4717]: E0221 22:46:29.980789 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:46:43 crc kubenswrapper[4717]: I0221 22:46:43.977587 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac" Feb 21 22:46:43 crc kubenswrapper[4717]: E0221 22:46:43.978927 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:46:56 crc kubenswrapper[4717]: I0221 22:46:56.977181 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac" Feb 21 
22:46:56 crc kubenswrapper[4717]: E0221 22:46:56.978394 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:47:07 crc kubenswrapper[4717]: I0221 22:47:07.976506 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"
Feb 21 22:47:07 crc kubenswrapper[4717]: E0221 22:47:07.977581 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:47:21 crc kubenswrapper[4717]: I0221 22:47:21.978084 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"
Feb 21 22:47:21 crc kubenswrapper[4717]: E0221 22:47:21.978916 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:47:35 crc kubenswrapper[4717]: I0221 22:47:35.992370 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"
Feb 21 22:47:35 crc kubenswrapper[4717]: E0221 22:47:35.993525 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:47:46 crc kubenswrapper[4717]: I0221 22:47:46.976613 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"
Feb 21 22:47:46 crc kubenswrapper[4717]: E0221 22:47:46.977471 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:48:01 crc kubenswrapper[4717]: I0221 22:48:01.978516 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"
Feb 21 22:48:01 crc kubenswrapper[4717]: E0221 22:48:01.979755 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:48:16 crc kubenswrapper[4717]: I0221 22:48:16.976895 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"
Feb 21 22:48:16 crc kubenswrapper[4717]: E0221 22:48:16.977925 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:48:27 crc kubenswrapper[4717]: I0221 22:48:27.976716 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"
Feb 21 22:48:27 crc kubenswrapper[4717]: E0221 22:48:27.977825 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:48:38 crc kubenswrapper[4717]: I0221 22:48:38.978152 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"
Feb 21 22:48:38 crc kubenswrapper[4717]: E0221 22:48:38.978935 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:48:50 crc kubenswrapper[4717]: I0221 22:48:50.978910 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"
Feb 21 22:48:50 crc kubenswrapper[4717]: E0221 22:48:50.979950 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:49:02 crc kubenswrapper[4717]: I0221 22:49:02.976354 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"
Feb 21 22:49:02 crc kubenswrapper[4717]: E0221 22:49:02.977158 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.753624 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-987jj/must-gather-x2sgd"]
Feb 21 22:49:11 crc kubenswrapper[4717]: E0221 22:49:11.754673 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15716053-ad9e-433e-9e65-e2e2cfa61cad" containerName="registry-server"
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.754690 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="15716053-ad9e-433e-9e65-e2e2cfa61cad" containerName="registry-server"
Feb 21 22:49:11 crc kubenswrapper[4717]: E0221 22:49:11.754717 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15716053-ad9e-433e-9e65-e2e2cfa61cad" containerName="extract-utilities"
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.754727 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="15716053-ad9e-433e-9e65-e2e2cfa61cad" containerName="extract-utilities"
Feb 21 22:49:11 crc kubenswrapper[4717]: E0221 22:49:11.754742 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15716053-ad9e-433e-9e65-e2e2cfa61cad" containerName="extract-content"
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.754749 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="15716053-ad9e-433e-9e65-e2e2cfa61cad" containerName="extract-content"
Feb 21 22:49:11 crc kubenswrapper[4717]: E0221 22:49:11.754765 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1878cb1-51e8-4916-9bff-b056af0bb210" containerName="gather"
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.754772 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1878cb1-51e8-4916-9bff-b056af0bb210" containerName="gather"
Feb 21 22:49:11 crc kubenswrapper[4717]: E0221 22:49:11.754799 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1878cb1-51e8-4916-9bff-b056af0bb210" containerName="copy"
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.754805 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1878cb1-51e8-4916-9bff-b056af0bb210" containerName="copy"
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.755037 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1878cb1-51e8-4916-9bff-b056af0bb210" containerName="copy"
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.755051 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="15716053-ad9e-433e-9e65-e2e2cfa61cad" containerName="registry-server"
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.755069 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1878cb1-51e8-4916-9bff-b056af0bb210" containerName="gather"
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.756178 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/must-gather-x2sgd"
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.758229 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-987jj"/"default-dockercfg-hw225"
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.759358 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-987jj"/"kube-root-ca.crt"
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.759418 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-987jj"/"openshift-service-ca.crt"
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.765086 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-987jj/must-gather-x2sgd"]
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.901957 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c4ca7e21-366d-48a0-85c5-fae13f9026d9-must-gather-output\") pod \"must-gather-x2sgd\" (UID: \"c4ca7e21-366d-48a0-85c5-fae13f9026d9\") " pod="openshift-must-gather-987jj/must-gather-x2sgd"
Feb 21 22:49:11 crc kubenswrapper[4717]: I0221 22:49:11.902079 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqmrz\" (UniqueName: \"kubernetes.io/projected/c4ca7e21-366d-48a0-85c5-fae13f9026d9-kube-api-access-jqmrz\") pod \"must-gather-x2sgd\" (UID: \"c4ca7e21-366d-48a0-85c5-fae13f9026d9\") " pod="openshift-must-gather-987jj/must-gather-x2sgd"
Feb 21 22:49:12 crc kubenswrapper[4717]: I0221 22:49:12.003360 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqmrz\" (UniqueName: \"kubernetes.io/projected/c4ca7e21-366d-48a0-85c5-fae13f9026d9-kube-api-access-jqmrz\") pod \"must-gather-x2sgd\" (UID: \"c4ca7e21-366d-48a0-85c5-fae13f9026d9\") " pod="openshift-must-gather-987jj/must-gather-x2sgd"
Feb 21 22:49:12 crc kubenswrapper[4717]: I0221 22:49:12.003501 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c4ca7e21-366d-48a0-85c5-fae13f9026d9-must-gather-output\") pod \"must-gather-x2sgd\" (UID: \"c4ca7e21-366d-48a0-85c5-fae13f9026d9\") " pod="openshift-must-gather-987jj/must-gather-x2sgd"
Feb 21 22:49:12 crc kubenswrapper[4717]: I0221 22:49:12.003946 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c4ca7e21-366d-48a0-85c5-fae13f9026d9-must-gather-output\") pod \"must-gather-x2sgd\" (UID: \"c4ca7e21-366d-48a0-85c5-fae13f9026d9\") " pod="openshift-must-gather-987jj/must-gather-x2sgd"
Feb 21 22:49:12 crc kubenswrapper[4717]: I0221 22:49:12.022191 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqmrz\" (UniqueName: \"kubernetes.io/projected/c4ca7e21-366d-48a0-85c5-fae13f9026d9-kube-api-access-jqmrz\") pod \"must-gather-x2sgd\" (UID: \"c4ca7e21-366d-48a0-85c5-fae13f9026d9\") " pod="openshift-must-gather-987jj/must-gather-x2sgd"
Feb 21 22:49:12 crc kubenswrapper[4717]: I0221 22:49:12.079573 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/must-gather-x2sgd"
Feb 21 22:49:13 crc kubenswrapper[4717]: I0221 22:49:12.622773 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-987jj/must-gather-x2sgd"]
Feb 21 22:49:13 crc kubenswrapper[4717]: I0221 22:49:13.094002 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-987jj/must-gather-x2sgd" event={"ID":"c4ca7e21-366d-48a0-85c5-fae13f9026d9","Type":"ContainerStarted","Data":"3c403ddffa24a287cd6beaab0a626a78ef33b6335cadfaf128af91fb15f65c6f"}
Feb 21 22:49:13 crc kubenswrapper[4717]: I0221 22:49:13.094254 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-987jj/must-gather-x2sgd" event={"ID":"c4ca7e21-366d-48a0-85c5-fae13f9026d9","Type":"ContainerStarted","Data":"702ebee2ba968b5cafdf752197b26902310784554404162e091662f65f6a8910"}
Feb 21 22:49:14 crc kubenswrapper[4717]: I0221 22:49:14.108504 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-987jj/must-gather-x2sgd" event={"ID":"c4ca7e21-366d-48a0-85c5-fae13f9026d9","Type":"ContainerStarted","Data":"6fe2d14c4b886ecd9edcb94bd6a903f5dbb1aa488bb53478950a35183d87a347"}
Feb 21 22:49:14 crc kubenswrapper[4717]: I0221 22:49:14.139276 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-987jj/must-gather-x2sgd" podStartSLOduration=3.139253952 podStartE2EDuration="3.139253952s" podCreationTimestamp="2026-02-21 22:49:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:49:14.129391406 +0000 UTC m=+3768.910925058" watchObservedRunningTime="2026-02-21 22:49:14.139253952 +0000 UTC m=+3768.920787584"
Feb 21 22:49:16 crc kubenswrapper[4717]: I0221 22:49:16.565279 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-987jj/crc-debug-crpfd"]
Feb 21 22:49:16 crc kubenswrapper[4717]: I0221 22:49:16.566568 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/crc-debug-crpfd"
Feb 21 22:49:16 crc kubenswrapper[4717]: I0221 22:49:16.691984 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsx4w\" (UniqueName: \"kubernetes.io/projected/4e722f9d-aba6-4c9d-b2e2-908121c15f9f-kube-api-access-vsx4w\") pod \"crc-debug-crpfd\" (UID: \"4e722f9d-aba6-4c9d-b2e2-908121c15f9f\") " pod="openshift-must-gather-987jj/crc-debug-crpfd"
Feb 21 22:49:16 crc kubenswrapper[4717]: I0221 22:49:16.692382 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e722f9d-aba6-4c9d-b2e2-908121c15f9f-host\") pod \"crc-debug-crpfd\" (UID: \"4e722f9d-aba6-4c9d-b2e2-908121c15f9f\") " pod="openshift-must-gather-987jj/crc-debug-crpfd"
Feb 21 22:49:16 crc kubenswrapper[4717]: I0221 22:49:16.793850 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsx4w\" (UniqueName: \"kubernetes.io/projected/4e722f9d-aba6-4c9d-b2e2-908121c15f9f-kube-api-access-vsx4w\") pod \"crc-debug-crpfd\" (UID: \"4e722f9d-aba6-4c9d-b2e2-908121c15f9f\") " pod="openshift-must-gather-987jj/crc-debug-crpfd"
Feb 21 22:49:16 crc kubenswrapper[4717]: I0221 22:49:16.793919 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e722f9d-aba6-4c9d-b2e2-908121c15f9f-host\") pod \"crc-debug-crpfd\" (UID: \"4e722f9d-aba6-4c9d-b2e2-908121c15f9f\") " pod="openshift-must-gather-987jj/crc-debug-crpfd"
Feb 21 22:49:16 crc kubenswrapper[4717]: I0221 22:49:16.794084 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e722f9d-aba6-4c9d-b2e2-908121c15f9f-host\") pod \"crc-debug-crpfd\" (UID: \"4e722f9d-aba6-4c9d-b2e2-908121c15f9f\") " pod="openshift-must-gather-987jj/crc-debug-crpfd"
Feb 21 22:49:16 crc kubenswrapper[4717]: I0221 22:49:16.820704 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsx4w\" (UniqueName: \"kubernetes.io/projected/4e722f9d-aba6-4c9d-b2e2-908121c15f9f-kube-api-access-vsx4w\") pod \"crc-debug-crpfd\" (UID: \"4e722f9d-aba6-4c9d-b2e2-908121c15f9f\") " pod="openshift-must-gather-987jj/crc-debug-crpfd"
Feb 21 22:49:16 crc kubenswrapper[4717]: I0221 22:49:16.892255 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/crc-debug-crpfd"
Feb 21 22:49:16 crc kubenswrapper[4717]: W0221 22:49:16.923384 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e722f9d_aba6_4c9d_b2e2_908121c15f9f.slice/crio-e8edb460e4058ea8c87736f5bc9b713a2e76dbb1072cbc8c35cb606a09597f61 WatchSource:0}: Error finding container e8edb460e4058ea8c87736f5bc9b713a2e76dbb1072cbc8c35cb606a09597f61: Status 404 returned error can't find the container with id e8edb460e4058ea8c87736f5bc9b713a2e76dbb1072cbc8c35cb606a09597f61
Feb 21 22:49:17 crc kubenswrapper[4717]: I0221 22:49:17.133632 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-987jj/crc-debug-crpfd" event={"ID":"4e722f9d-aba6-4c9d-b2e2-908121c15f9f","Type":"ContainerStarted","Data":"e8edb460e4058ea8c87736f5bc9b713a2e76dbb1072cbc8c35cb606a09597f61"}
Feb 21 22:49:17 crc kubenswrapper[4717]: I0221 22:49:17.976418 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"
Feb 21 22:49:17 crc kubenswrapper[4717]: E0221 22:49:17.977814 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:49:18 crc kubenswrapper[4717]: I0221 22:49:18.142422 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-987jj/crc-debug-crpfd" event={"ID":"4e722f9d-aba6-4c9d-b2e2-908121c15f9f","Type":"ContainerStarted","Data":"c3ee99772047ead465663eba81a7a3eaf8eedc6c0c8ef21f30401adbe3d87357"}
Feb 21 22:49:18 crc kubenswrapper[4717]: I0221 22:49:18.167906 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-987jj/crc-debug-crpfd" podStartSLOduration=2.1678877229999998 podStartE2EDuration="2.167887723s" podCreationTimestamp="2026-02-21 22:49:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-21 22:49:18.160294131 +0000 UTC m=+3772.941827763" watchObservedRunningTime="2026-02-21 22:49:18.167887723 +0000 UTC m=+3772.949421355"
Feb 21 22:49:31 crc kubenswrapper[4717]: I0221 22:49:31.976416 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"
Feb 21 22:49:31 crc kubenswrapper[4717]: E0221 22:49:31.977433 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:49:42 crc kubenswrapper[4717]: I0221 22:49:42.977545 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac"
Feb 21 22:49:42 crc kubenswrapper[4717]: E0221 22:49:42.978545 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:49:49 crc kubenswrapper[4717]: I0221 22:49:49.411924 4717 generic.go:334] "Generic (PLEG): container finished" podID="4e722f9d-aba6-4c9d-b2e2-908121c15f9f" containerID="c3ee99772047ead465663eba81a7a3eaf8eedc6c0c8ef21f30401adbe3d87357" exitCode=0
Feb 21 22:49:49 crc kubenswrapper[4717]: I0221 22:49:49.412015 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-987jj/crc-debug-crpfd" event={"ID":"4e722f9d-aba6-4c9d-b2e2-908121c15f9f","Type":"ContainerDied","Data":"c3ee99772047ead465663eba81a7a3eaf8eedc6c0c8ef21f30401adbe3d87357"}
Feb 21 22:49:50 crc kubenswrapper[4717]: I0221 22:49:50.548697 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/crc-debug-crpfd"
Feb 21 22:49:50 crc kubenswrapper[4717]: I0221 22:49:50.579952 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-987jj/crc-debug-crpfd"]
Feb 21 22:49:50 crc kubenswrapper[4717]: I0221 22:49:50.587432 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-987jj/crc-debug-crpfd"]
Feb 21 22:49:50 crc kubenswrapper[4717]: I0221 22:49:50.730912 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e722f9d-aba6-4c9d-b2e2-908121c15f9f-host\") pod \"4e722f9d-aba6-4c9d-b2e2-908121c15f9f\" (UID: \"4e722f9d-aba6-4c9d-b2e2-908121c15f9f\") "
Feb 21 22:49:50 crc kubenswrapper[4717]: I0221 22:49:50.731074 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e722f9d-aba6-4c9d-b2e2-908121c15f9f-host" (OuterVolumeSpecName: "host") pod "4e722f9d-aba6-4c9d-b2e2-908121c15f9f" (UID: "4e722f9d-aba6-4c9d-b2e2-908121c15f9f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 22:49:50 crc kubenswrapper[4717]: I0221 22:49:50.731304 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsx4w\" (UniqueName: \"kubernetes.io/projected/4e722f9d-aba6-4c9d-b2e2-908121c15f9f-kube-api-access-vsx4w\") pod \"4e722f9d-aba6-4c9d-b2e2-908121c15f9f\" (UID: \"4e722f9d-aba6-4c9d-b2e2-908121c15f9f\") "
Feb 21 22:49:50 crc kubenswrapper[4717]: I0221 22:49:50.731766 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e722f9d-aba6-4c9d-b2e2-908121c15f9f-host\") on node \"crc\" DevicePath \"\""
Feb 21 22:49:50 crc kubenswrapper[4717]: I0221 22:49:50.746033 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e722f9d-aba6-4c9d-b2e2-908121c15f9f-kube-api-access-vsx4w" (OuterVolumeSpecName: "kube-api-access-vsx4w") pod "4e722f9d-aba6-4c9d-b2e2-908121c15f9f" (UID: "4e722f9d-aba6-4c9d-b2e2-908121c15f9f"). InnerVolumeSpecName "kube-api-access-vsx4w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:49:50 crc kubenswrapper[4717]: I0221 22:49:50.833659 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsx4w\" (UniqueName: \"kubernetes.io/projected/4e722f9d-aba6-4c9d-b2e2-908121c15f9f-kube-api-access-vsx4w\") on node \"crc\" DevicePath \"\""
Feb 21 22:49:51 crc kubenswrapper[4717]: I0221 22:49:51.441955 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8edb460e4058ea8c87736f5bc9b713a2e76dbb1072cbc8c35cb606a09597f61"
Feb 21 22:49:51 crc kubenswrapper[4717]: I0221 22:49:51.442053 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/crc-debug-crpfd"
Feb 21 22:49:51 crc kubenswrapper[4717]: I0221 22:49:51.781465 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-987jj/crc-debug-rzgvc"]
Feb 21 22:49:51 crc kubenswrapper[4717]: E0221 22:49:51.781838 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e722f9d-aba6-4c9d-b2e2-908121c15f9f" containerName="container-00"
Feb 21 22:49:51 crc kubenswrapper[4717]: I0221 22:49:51.781850 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e722f9d-aba6-4c9d-b2e2-908121c15f9f" containerName="container-00"
Feb 21 22:49:51 crc kubenswrapper[4717]: I0221 22:49:51.782117 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e722f9d-aba6-4c9d-b2e2-908121c15f9f" containerName="container-00"
Feb 21 22:49:51 crc kubenswrapper[4717]: I0221 22:49:51.782753 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/crc-debug-rzgvc"
Feb 21 22:49:51 crc kubenswrapper[4717]: I0221 22:49:51.955931 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9488\" (UniqueName: \"kubernetes.io/projected/2f93c787-804d-4b4c-b9a9-0602390790ea-kube-api-access-q9488\") pod \"crc-debug-rzgvc\" (UID: \"2f93c787-804d-4b4c-b9a9-0602390790ea\") " pod="openshift-must-gather-987jj/crc-debug-rzgvc"
Feb 21 22:49:51 crc kubenswrapper[4717]: I0221 22:49:51.956072 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f93c787-804d-4b4c-b9a9-0602390790ea-host\") pod \"crc-debug-rzgvc\" (UID: \"2f93c787-804d-4b4c-b9a9-0602390790ea\") " pod="openshift-must-gather-987jj/crc-debug-rzgvc"
Feb 21 22:49:51 crc kubenswrapper[4717]: I0221 22:49:51.990978 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e722f9d-aba6-4c9d-b2e2-908121c15f9f" path="/var/lib/kubelet/pods/4e722f9d-aba6-4c9d-b2e2-908121c15f9f/volumes"
Feb 21 22:49:52 crc kubenswrapper[4717]: I0221 22:49:52.058132 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9488\" (UniqueName: \"kubernetes.io/projected/2f93c787-804d-4b4c-b9a9-0602390790ea-kube-api-access-q9488\") pod \"crc-debug-rzgvc\" (UID: \"2f93c787-804d-4b4c-b9a9-0602390790ea\") " pod="openshift-must-gather-987jj/crc-debug-rzgvc"
Feb 21 22:49:52 crc kubenswrapper[4717]: I0221 22:49:52.058240 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f93c787-804d-4b4c-b9a9-0602390790ea-host\") pod \"crc-debug-rzgvc\" (UID: \"2f93c787-804d-4b4c-b9a9-0602390790ea\") " pod="openshift-must-gather-987jj/crc-debug-rzgvc"
Feb 21 22:49:52 crc kubenswrapper[4717]: I0221 22:49:52.058325 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f93c787-804d-4b4c-b9a9-0602390790ea-host\") pod \"crc-debug-rzgvc\" (UID: \"2f93c787-804d-4b4c-b9a9-0602390790ea\") " pod="openshift-must-gather-987jj/crc-debug-rzgvc"
Feb 21 22:49:52 crc kubenswrapper[4717]: I0221 22:49:52.088396 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9488\" (UniqueName: \"kubernetes.io/projected/2f93c787-804d-4b4c-b9a9-0602390790ea-kube-api-access-q9488\") pod \"crc-debug-rzgvc\" (UID: \"2f93c787-804d-4b4c-b9a9-0602390790ea\") " pod="openshift-must-gather-987jj/crc-debug-rzgvc"
Feb 21 22:49:52 crc kubenswrapper[4717]: I0221 22:49:52.098488 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/crc-debug-rzgvc"
Feb 21 22:49:52 crc kubenswrapper[4717]: I0221 22:49:52.452671 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f93c787-804d-4b4c-b9a9-0602390790ea" containerID="815fd62c7df9ed18584b44d0e9debaaab0207491e591b0016407f5855b2e0e7d" exitCode=0
Feb 21 22:49:52 crc kubenswrapper[4717]: I0221 22:49:52.452718 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-987jj/crc-debug-rzgvc" event={"ID":"2f93c787-804d-4b4c-b9a9-0602390790ea","Type":"ContainerDied","Data":"815fd62c7df9ed18584b44d0e9debaaab0207491e591b0016407f5855b2e0e7d"}
Feb 21 22:49:52 crc kubenswrapper[4717]: I0221 22:49:52.453171 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-987jj/crc-debug-rzgvc" event={"ID":"2f93c787-804d-4b4c-b9a9-0602390790ea","Type":"ContainerStarted","Data":"908c5b3941a01669f615310b6c0270b36af5032efdb9f47660cfa12bccaa466d"}
Feb 21 22:49:52 crc kubenswrapper[4717]: I0221 22:49:52.853267 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-987jj/crc-debug-rzgvc"]
Feb 21 22:49:52 crc kubenswrapper[4717]: I0221 22:49:52.861297 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-987jj/crc-debug-rzgvc"]
Feb 21 22:49:53 crc kubenswrapper[4717]: I0221 22:49:53.576758 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/crc-debug-rzgvc"
Feb 21 22:49:53 crc kubenswrapper[4717]: I0221 22:49:53.684936 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9488\" (UniqueName: \"kubernetes.io/projected/2f93c787-804d-4b4c-b9a9-0602390790ea-kube-api-access-q9488\") pod \"2f93c787-804d-4b4c-b9a9-0602390790ea\" (UID: \"2f93c787-804d-4b4c-b9a9-0602390790ea\") "
Feb 21 22:49:53 crc kubenswrapper[4717]: I0221 22:49:53.684975 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f93c787-804d-4b4c-b9a9-0602390790ea-host\") pod \"2f93c787-804d-4b4c-b9a9-0602390790ea\" (UID: \"2f93c787-804d-4b4c-b9a9-0602390790ea\") "
Feb 21 22:49:53 crc kubenswrapper[4717]: I0221 22:49:53.685205 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f93c787-804d-4b4c-b9a9-0602390790ea-host" (OuterVolumeSpecName: "host") pod "2f93c787-804d-4b4c-b9a9-0602390790ea" (UID: "2f93c787-804d-4b4c-b9a9-0602390790ea"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 22:49:53 crc kubenswrapper[4717]: I0221 22:49:53.685531 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f93c787-804d-4b4c-b9a9-0602390790ea-host\") on node \"crc\" DevicePath \"\""
Feb 21 22:49:53 crc kubenswrapper[4717]: I0221 22:49:53.699050 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f93c787-804d-4b4c-b9a9-0602390790ea-kube-api-access-q9488" (OuterVolumeSpecName: "kube-api-access-q9488") pod "2f93c787-804d-4b4c-b9a9-0602390790ea" (UID: "2f93c787-804d-4b4c-b9a9-0602390790ea"). InnerVolumeSpecName "kube-api-access-q9488". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:49:53 crc kubenswrapper[4717]: I0221 22:49:53.787773 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9488\" (UniqueName: \"kubernetes.io/projected/2f93c787-804d-4b4c-b9a9-0602390790ea-kube-api-access-q9488\") on node \"crc\" DevicePath \"\""
Feb 21 22:49:53 crc kubenswrapper[4717]: I0221 22:49:53.990307 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f93c787-804d-4b4c-b9a9-0602390790ea" path="/var/lib/kubelet/pods/2f93c787-804d-4b4c-b9a9-0602390790ea/volumes"
Feb 21 22:49:54 crc kubenswrapper[4717]: I0221 22:49:54.027922 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-987jj/crc-debug-xpfkz"]
Feb 21 22:49:54 crc kubenswrapper[4717]: E0221 22:49:54.028402 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f93c787-804d-4b4c-b9a9-0602390790ea" containerName="container-00"
Feb 21 22:49:54 crc kubenswrapper[4717]: I0221 22:49:54.028421 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f93c787-804d-4b4c-b9a9-0602390790ea" containerName="container-00"
Feb 21 22:49:54 crc kubenswrapper[4717]: I0221 22:49:54.028652 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f93c787-804d-4b4c-b9a9-0602390790ea" containerName="container-00"
Feb 21 22:49:54 crc kubenswrapper[4717]: I0221 22:49:54.029434 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/crc-debug-xpfkz"
Feb 21 22:49:54 crc kubenswrapper[4717]: I0221 22:49:54.195951 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/79c9c1d7-d921-41d4-9775-3823dc4f5422-host\") pod \"crc-debug-xpfkz\" (UID: \"79c9c1d7-d921-41d4-9775-3823dc4f5422\") " pod="openshift-must-gather-987jj/crc-debug-xpfkz"
Feb 21 22:49:54 crc kubenswrapper[4717]: I0221 22:49:54.196022 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4wmm\" (UniqueName: \"kubernetes.io/projected/79c9c1d7-d921-41d4-9775-3823dc4f5422-kube-api-access-l4wmm\") pod \"crc-debug-xpfkz\" (UID: \"79c9c1d7-d921-41d4-9775-3823dc4f5422\") " pod="openshift-must-gather-987jj/crc-debug-xpfkz"
Feb 21 22:49:54 crc kubenswrapper[4717]: I0221 22:49:54.299851 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/79c9c1d7-d921-41d4-9775-3823dc4f5422-host\") pod \"crc-debug-xpfkz\" (UID: \"79c9c1d7-d921-41d4-9775-3823dc4f5422\") " pod="openshift-must-gather-987jj/crc-debug-xpfkz"
Feb 21 22:49:54 crc kubenswrapper[4717]: I0221 22:49:54.299976 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4wmm\" (UniqueName: \"kubernetes.io/projected/79c9c1d7-d921-41d4-9775-3823dc4f5422-kube-api-access-l4wmm\") pod \"crc-debug-xpfkz\" (UID: \"79c9c1d7-d921-41d4-9775-3823dc4f5422\") " pod="openshift-must-gather-987jj/crc-debug-xpfkz"
Feb 21 22:49:54 crc kubenswrapper[4717]: I0221 22:49:54.300472 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/79c9c1d7-d921-41d4-9775-3823dc4f5422-host\") pod \"crc-debug-xpfkz\" (UID: \"79c9c1d7-d921-41d4-9775-3823dc4f5422\") " pod="openshift-must-gather-987jj/crc-debug-xpfkz"
Feb 21 22:49:54 crc kubenswrapper[4717]: I0221 22:49:54.317553 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4wmm\" (UniqueName: \"kubernetes.io/projected/79c9c1d7-d921-41d4-9775-3823dc4f5422-kube-api-access-l4wmm\") pod \"crc-debug-xpfkz\" (UID: \"79c9c1d7-d921-41d4-9775-3823dc4f5422\") " pod="openshift-must-gather-987jj/crc-debug-xpfkz"
Feb 21 22:49:54 crc kubenswrapper[4717]: I0221 22:49:54.347918 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/crc-debug-xpfkz"
Feb 21 22:49:54 crc kubenswrapper[4717]: W0221 22:49:54.379278 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79c9c1d7_d921_41d4_9775_3823dc4f5422.slice/crio-13b32542752e97355c2affad687bda8d5647a3462892b75298052f5e605615b8 WatchSource:0}: Error finding container 13b32542752e97355c2affad687bda8d5647a3462892b75298052f5e605615b8: Status 404 returned error can't find the container with id 13b32542752e97355c2affad687bda8d5647a3462892b75298052f5e605615b8
Feb 21 22:49:54 crc kubenswrapper[4717]: I0221 22:49:54.473555 4717 scope.go:117] "RemoveContainer" containerID="815fd62c7df9ed18584b44d0e9debaaab0207491e591b0016407f5855b2e0e7d"
Feb 21 22:49:54 crc kubenswrapper[4717]: I0221 22:49:54.473689 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/crc-debug-rzgvc"
Feb 21 22:49:54 crc kubenswrapper[4717]: I0221 22:49:54.478916 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-987jj/crc-debug-xpfkz" event={"ID":"79c9c1d7-d921-41d4-9775-3823dc4f5422","Type":"ContainerStarted","Data":"13b32542752e97355c2affad687bda8d5647a3462892b75298052f5e605615b8"}
Feb 21 22:49:55 crc kubenswrapper[4717]: I0221 22:49:55.488450 4717 generic.go:334] "Generic (PLEG): container finished" podID="79c9c1d7-d921-41d4-9775-3823dc4f5422" containerID="d17540244c7db7889ca5be9daa6376f8b3ef0216fbbd7bce419b30760696894b" exitCode=0
Feb 21 22:49:55 crc kubenswrapper[4717]: I0221 22:49:55.488594 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-987jj/crc-debug-xpfkz" event={"ID":"79c9c1d7-d921-41d4-9775-3823dc4f5422","Type":"ContainerDied","Data":"d17540244c7db7889ca5be9daa6376f8b3ef0216fbbd7bce419b30760696894b"}
Feb 21 22:49:55 crc kubenswrapper[4717]: I0221 22:49:55.534909 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-987jj/crc-debug-xpfkz"]
Feb 21 22:49:55 crc kubenswrapper[4717]: I0221 22:49:55.543853 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-987jj/crc-debug-xpfkz"]
Feb 21 22:49:56 crc kubenswrapper[4717]: I0221 22:49:56.598455 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/crc-debug-xpfkz"
Feb 21 22:49:56 crc kubenswrapper[4717]: I0221 22:49:56.744629 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4wmm\" (UniqueName: \"kubernetes.io/projected/79c9c1d7-d921-41d4-9775-3823dc4f5422-kube-api-access-l4wmm\") pod \"79c9c1d7-d921-41d4-9775-3823dc4f5422\" (UID: \"79c9c1d7-d921-41d4-9775-3823dc4f5422\") "
Feb 21 22:49:56 crc kubenswrapper[4717]: I0221 22:49:56.744828 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/79c9c1d7-d921-41d4-9775-3823dc4f5422-host\") pod \"79c9c1d7-d921-41d4-9775-3823dc4f5422\" (UID: \"79c9c1d7-d921-41d4-9775-3823dc4f5422\") "
Feb 21 22:49:56 crc kubenswrapper[4717]: I0221 22:49:56.744965 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79c9c1d7-d921-41d4-9775-3823dc4f5422-host" (OuterVolumeSpecName: "host") pod "79c9c1d7-d921-41d4-9775-3823dc4f5422" (UID: "79c9c1d7-d921-41d4-9775-3823dc4f5422"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 21 22:49:56 crc kubenswrapper[4717]: I0221 22:49:56.745430 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/79c9c1d7-d921-41d4-9775-3823dc4f5422-host\") on node \"crc\" DevicePath \"\""
Feb 21 22:49:56 crc kubenswrapper[4717]: I0221 22:49:56.751831 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79c9c1d7-d921-41d4-9775-3823dc4f5422-kube-api-access-l4wmm" (OuterVolumeSpecName: "kube-api-access-l4wmm") pod "79c9c1d7-d921-41d4-9775-3823dc4f5422" (UID: "79c9c1d7-d921-41d4-9775-3823dc4f5422"). InnerVolumeSpecName "kube-api-access-l4wmm".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:49:56 crc kubenswrapper[4717]: I0221 22:49:56.847966 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4wmm\" (UniqueName: \"kubernetes.io/projected/79c9c1d7-d921-41d4-9775-3823dc4f5422-kube-api-access-l4wmm\") on node \"crc\" DevicePath \"\"" Feb 21 22:49:57 crc kubenswrapper[4717]: I0221 22:49:57.513531 4717 scope.go:117] "RemoveContainer" containerID="d17540244c7db7889ca5be9daa6376f8b3ef0216fbbd7bce419b30760696894b" Feb 21 22:49:57 crc kubenswrapper[4717]: I0221 22:49:57.513592 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/crc-debug-xpfkz" Feb 21 22:49:57 crc kubenswrapper[4717]: I0221 22:49:57.981596 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac" Feb 21 22:49:57 crc kubenswrapper[4717]: E0221 22:49:57.982008 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:49:57 crc kubenswrapper[4717]: I0221 22:49:57.991464 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79c9c1d7-d921-41d4-9775-3823dc4f5422" path="/var/lib/kubelet/pods/79c9c1d7-d921-41d4-9775-3823dc4f5422/volumes" Feb 21 22:50:08 crc kubenswrapper[4717]: I0221 22:50:08.977072 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac" Feb 21 22:50:08 crc kubenswrapper[4717]: E0221 22:50:08.978462 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:50:20 crc kubenswrapper[4717]: I0221 22:50:20.976221 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac" Feb 21 22:50:20 crc kubenswrapper[4717]: E0221 22:50:20.976939 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:50:29 crc kubenswrapper[4717]: I0221 22:50:29.884828 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-df9cd9b6-x9br7_ace10c13-1b1b-4e3a-8f58-b8dab0c80704/barbican-api/0.log" Feb 21 22:50:30 crc kubenswrapper[4717]: I0221 22:50:30.028169 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-df9cd9b6-x9br7_ace10c13-1b1b-4e3a-8f58-b8dab0c80704/barbican-api-log/0.log" Feb 21 22:50:30 crc kubenswrapper[4717]: I0221 22:50:30.067185 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6bbdbc7546-gvgtj_71cac7a0-f790-43e5-87f9-d3862c20f857/barbican-keystone-listener/0.log" Feb 21 22:50:30 crc kubenswrapper[4717]: I0221 22:50:30.098472 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6bbdbc7546-gvgtj_71cac7a0-f790-43e5-87f9-d3862c20f857/barbican-keystone-listener-log/0.log" Feb 21 22:50:30 crc kubenswrapper[4717]: I0221 
22:50:30.264626 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6fbd665b5-2sdwf_30134acb-a272-4da8-a2b6-683e431f593e/barbican-worker-log/0.log" Feb 21 22:50:30 crc kubenswrapper[4717]: I0221 22:50:30.267951 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6fbd665b5-2sdwf_30134acb-a272-4da8-a2b6-683e431f593e/barbican-worker/0.log" Feb 21 22:50:30 crc kubenswrapper[4717]: I0221 22:50:30.479256 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-twq55_da6e1269-a5c6-4f39-8d0a-b544de9522ba/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:30 crc kubenswrapper[4717]: I0221 22:50:30.488006 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_08a3f1a6-792f-4679-84f0-55795cb63990/ceilometer-central-agent/0.log" Feb 21 22:50:30 crc kubenswrapper[4717]: I0221 22:50:30.608687 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_08a3f1a6-792f-4679-84f0-55795cb63990/ceilometer-notification-agent/0.log" Feb 21 22:50:30 crc kubenswrapper[4717]: I0221 22:50:30.647132 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_08a3f1a6-792f-4679-84f0-55795cb63990/proxy-httpd/0.log" Feb 21 22:50:30 crc kubenswrapper[4717]: I0221 22:50:30.657143 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_08a3f1a6-792f-4679-84f0-55795cb63990/sg-core/0.log" Feb 21 22:50:30 crc kubenswrapper[4717]: I0221 22:50:30.859782 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_db0133ac-ab76-4b9d-a3e9-e55a095f919a/cinder-api/0.log" Feb 21 22:50:30 crc kubenswrapper[4717]: I0221 22:50:30.867558 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_db0133ac-ab76-4b9d-a3e9-e55a095f919a/cinder-api-log/0.log" Feb 21 22:50:31 crc kubenswrapper[4717]: 
I0221 22:50:31.030904 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9955b361-63c8-42bb-9efc-7ab0b3150904/cinder-scheduler/0.log" Feb 21 22:50:31 crc kubenswrapper[4717]: I0221 22:50:31.103669 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9955b361-63c8-42bb-9efc-7ab0b3150904/probe/0.log" Feb 21 22:50:31 crc kubenswrapper[4717]: I0221 22:50:31.132423 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-7cxwk_79339bc9-6d8a-4fe5-ba8d-37643afe6d98/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:31 crc kubenswrapper[4717]: I0221 22:50:31.337934 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-q56p6_ba43f982-ee7f-4e48-a144-0c6d5c54c5a1/init/0.log" Feb 21 22:50:31 crc kubenswrapper[4717]: I0221 22:50:31.362350 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6hp66_ce64ba17-432c-46d7-86f2-33cc62514604/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:31 crc kubenswrapper[4717]: I0221 22:50:31.509209 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-q56p6_ba43f982-ee7f-4e48-a144-0c6d5c54c5a1/init/0.log" Feb 21 22:50:31 crc kubenswrapper[4717]: I0221 22:50:31.545777 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-q56p6_ba43f982-ee7f-4e48-a144-0c6d5c54c5a1/dnsmasq-dns/0.log" Feb 21 22:50:31 crc kubenswrapper[4717]: I0221 22:50:31.579484 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-lw9lm_674f8569-62c3-477e-85af-13befe292f49/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:31 crc kubenswrapper[4717]: I0221 22:50:31.769957 4717 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_glance-default-external-api-0_a8060c80-2f4b-4099-b5fd-841fadcdb329/glance-log/0.log" Feb 21 22:50:31 crc kubenswrapper[4717]: I0221 22:50:31.785646 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a8060c80-2f4b-4099-b5fd-841fadcdb329/glance-httpd/0.log" Feb 21 22:50:31 crc kubenswrapper[4717]: I0221 22:50:31.919960 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_128a5f99-d2a9-4551-8fd0-45efc6017dab/glance-httpd/0.log" Feb 21 22:50:31 crc kubenswrapper[4717]: I0221 22:50:31.959668 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_128a5f99-d2a9-4551-8fd0-45efc6017dab/glance-log/0.log" Feb 21 22:50:32 crc kubenswrapper[4717]: I0221 22:50:32.151907 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-86b8dffbf6-mztpd_4e230efb-55a4-4e7f-9d9a-cc61d3123eab/horizon/0.log" Feb 21 22:50:32 crc kubenswrapper[4717]: I0221 22:50:32.250257 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-j6gml_6cddc12c-e9af-45b5-a4cc-c9c7f6d5f5f4/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:32 crc kubenswrapper[4717]: I0221 22:50:32.418990 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-ml46s_0fcb6379-1d8c-44c9-8e50-10c52e40abcd/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:32 crc kubenswrapper[4717]: I0221 22:50:32.483526 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-86b8dffbf6-mztpd_4e230efb-55a4-4e7f-9d9a-cc61d3123eab/horizon-log/0.log" Feb 21 22:50:32 crc kubenswrapper[4717]: I0221 22:50:32.710583 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_3f8bbbe9-6ba9-48a4-8ade-3e41ee668a2b/kube-state-metrics/0.log" Feb 21 22:50:32 crc kubenswrapper[4717]: I0221 22:50:32.745452 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-86d46bb596-pj8cr_110e5c1e-4f14-4ab1-a0e0-f54dec9095a2/keystone-api/0.log" Feb 21 22:50:32 crc kubenswrapper[4717]: I0221 22:50:32.860238 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-xpj84_4dac73d5-a471-44a7-a91a-3422e09c7bb0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:33 crc kubenswrapper[4717]: I0221 22:50:33.194308 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-677cdf8c9f-j2vl7_5aad4222-11d4-4bb0-9a90-b9924339c70e/neutron-httpd/0.log" Feb 21 22:50:33 crc kubenswrapper[4717]: I0221 22:50:33.207054 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-677cdf8c9f-j2vl7_5aad4222-11d4-4bb0-9a90-b9924339c70e/neutron-api/0.log" Feb 21 22:50:33 crc kubenswrapper[4717]: I0221 22:50:33.370847 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2hpx_788c0818-654d-4001-a3ab-06c9dbd10592/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:33 crc kubenswrapper[4717]: I0221 22:50:33.912078 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3b045abf-1d97-45e2-a8ed-ed13aedc19f7/nova-api-log/0.log" Feb 21 22:50:33 crc kubenswrapper[4717]: I0221 22:50:33.971998 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_1030ba56-81a8-4d0d-8e3d-c17779adcac6/nova-cell0-conductor-conductor/0.log" Feb 21 22:50:33 crc kubenswrapper[4717]: I0221 22:50:33.975848 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac" Feb 21 22:50:33 crc kubenswrapper[4717]: 
E0221 22:50:33.976075 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" Feb 21 22:50:34 crc kubenswrapper[4717]: I0221 22:50:34.097261 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3b045abf-1d97-45e2-a8ed-ed13aedc19f7/nova-api-api/0.log" Feb 21 22:50:34 crc kubenswrapper[4717]: I0221 22:50:34.303911 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_5e40eb10-e007-4fc0-97ff-de455effb430/nova-cell1-conductor-conductor/0.log" Feb 21 22:50:34 crc kubenswrapper[4717]: I0221 22:50:34.329808 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_7a7413e9-1833-491d-8d42-0ac926edea33/nova-cell1-novncproxy-novncproxy/0.log" Feb 21 22:50:34 crc kubenswrapper[4717]: I0221 22:50:34.397805 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-7l5ng_0fa98805-0ef5-463d-9ae3-1a66efcb9b0c/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:34 crc kubenswrapper[4717]: I0221 22:50:34.655730 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5e8a58dd-e035-414e-b133-160ea01477aa/nova-metadata-log/0.log" Feb 21 22:50:34 crc kubenswrapper[4717]: I0221 22:50:34.891165 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4a73e4bf-8575-43f8-bfff-35b8ca593732/mysql-bootstrap/0.log" Feb 21 22:50:34 crc kubenswrapper[4717]: I0221 22:50:34.935051 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_14550721-34cc-41f8-a5f0-e15e73cf2983/nova-scheduler-scheduler/0.log" Feb 21 22:50:35 crc kubenswrapper[4717]: I0221 22:50:35.028735 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4a73e4bf-8575-43f8-bfff-35b8ca593732/mysql-bootstrap/0.log" Feb 21 22:50:35 crc kubenswrapper[4717]: I0221 22:50:35.073909 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4a73e4bf-8575-43f8-bfff-35b8ca593732/galera/0.log" Feb 21 22:50:35 crc kubenswrapper[4717]: I0221 22:50:35.237414 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e49285f0-879f-40db-8eb9-2e8e18a87bb7/mysql-bootstrap/0.log" Feb 21 22:50:35 crc kubenswrapper[4717]: I0221 22:50:35.455774 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e49285f0-879f-40db-8eb9-2e8e18a87bb7/mysql-bootstrap/0.log" Feb 21 22:50:35 crc kubenswrapper[4717]: I0221 22:50:35.463000 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e49285f0-879f-40db-8eb9-2e8e18a87bb7/galera/0.log" Feb 21 22:50:35 crc kubenswrapper[4717]: I0221 22:50:35.553351 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5e8a58dd-e035-414e-b133-160ea01477aa/nova-metadata-metadata/0.log" Feb 21 22:50:35 crc kubenswrapper[4717]: I0221 22:50:35.677963 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_756781b7-938f-4654-8109-725420287d7b/openstackclient/0.log" Feb 21 22:50:35 crc kubenswrapper[4717]: I0221 22:50:35.710974 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-828xd_4ca26e0c-e312-4c2c-9e4f-89d8b15b81a1/ovn-controller/0.log" Feb 21 22:50:35 crc kubenswrapper[4717]: I0221 22:50:35.880630 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-kllk6_d770b9d0-2378-4b10-bf5a-b7e91c6b3843/openstack-network-exporter/0.log" Feb 21 22:50:36 crc kubenswrapper[4717]: I0221 22:50:36.019045 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sls6z_51572320-b28e-45be-ba55-524f9e0ccc61/ovsdb-server-init/0.log" Feb 21 22:50:36 crc kubenswrapper[4717]: I0221 22:50:36.246693 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sls6z_51572320-b28e-45be-ba55-524f9e0ccc61/ovsdb-server-init/0.log" Feb 21 22:50:36 crc kubenswrapper[4717]: I0221 22:50:36.248120 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sls6z_51572320-b28e-45be-ba55-524f9e0ccc61/ovs-vswitchd/0.log" Feb 21 22:50:36 crc kubenswrapper[4717]: I0221 22:50:36.264616 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sls6z_51572320-b28e-45be-ba55-524f9e0ccc61/ovsdb-server/0.log" Feb 21 22:50:36 crc kubenswrapper[4717]: I0221 22:50:36.431618 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_34cc3509-6f63-43c2-86a3-284360464284/openstack-network-exporter/0.log" Feb 21 22:50:36 crc kubenswrapper[4717]: I0221 22:50:36.454822 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-dvvlx_cb09f299-8779-421d-a58f-dd16db2daadb/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:36 crc kubenswrapper[4717]: I0221 22:50:36.494729 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_34cc3509-6f63-43c2-86a3-284360464284/ovn-northd/0.log" Feb 21 22:50:36 crc kubenswrapper[4717]: I0221 22:50:36.691128 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dcfdad72-66d7-4087-b0c3-4cb1925565a1/openstack-network-exporter/0.log" Feb 21 22:50:36 crc kubenswrapper[4717]: I0221 22:50:36.737066 
4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dcfdad72-66d7-4087-b0c3-4cb1925565a1/ovsdbserver-nb/0.log" Feb 21 22:50:36 crc kubenswrapper[4717]: I0221 22:50:36.871136 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_75f38ef9-3fc9-428a-8364-96c3938d69e5/openstack-network-exporter/0.log" Feb 21 22:50:36 crc kubenswrapper[4717]: I0221 22:50:36.956580 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_75f38ef9-3fc9-428a-8364-96c3938d69e5/ovsdbserver-sb/0.log" Feb 21 22:50:37 crc kubenswrapper[4717]: I0221 22:50:37.152252 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f5bdf5f76-9rjst_131cec55-efe0-49f9-ad5e-cfbca687c941/placement-api/0.log" Feb 21 22:50:37 crc kubenswrapper[4717]: I0221 22:50:37.175171 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f5bdf5f76-9rjst_131cec55-efe0-49f9-ad5e-cfbca687c941/placement-log/0.log" Feb 21 22:50:37 crc kubenswrapper[4717]: I0221 22:50:37.175279 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b5547796-04e1-40e3-aa4a-a1aa936efcda/setup-container/0.log" Feb 21 22:50:37 crc kubenswrapper[4717]: I0221 22:50:37.398071 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b5547796-04e1-40e3-aa4a-a1aa936efcda/setup-container/0.log" Feb 21 22:50:37 crc kubenswrapper[4717]: I0221 22:50:37.443672 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b5547796-04e1-40e3-aa4a-a1aa936efcda/rabbitmq/0.log" Feb 21 22:50:37 crc kubenswrapper[4717]: I0221 22:50:37.449251 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_38ce8ac2-d776-449b-89d8-3e9a853a8f44/setup-container/0.log" Feb 21 22:50:37 crc kubenswrapper[4717]: I0221 22:50:37.603885 4717 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_rabbitmq-server-0_38ce8ac2-d776-449b-89d8-3e9a853a8f44/setup-container/0.log" Feb 21 22:50:37 crc kubenswrapper[4717]: I0221 22:50:37.660532 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-fxwq5_849e3b78-1693-47ad-9fc7-63c7a188d53e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:37 crc kubenswrapper[4717]: I0221 22:50:37.681307 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_38ce8ac2-d776-449b-89d8-3e9a853a8f44/rabbitmq/0.log" Feb 21 22:50:37 crc kubenswrapper[4717]: I0221 22:50:37.848058 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-tqqzt_49b31634-aebd-4b3a-a1a3-f7d3e06782cf/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:37 crc kubenswrapper[4717]: I0221 22:50:37.938071 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-6gcgg_331acbed-5028-4fdb-84a4-b105805863b9/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:38 crc kubenswrapper[4717]: I0221 22:50:38.114658 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-pslvp_d1311e18-5b26-49a3-86e3-481b3f9e5b03/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:38 crc kubenswrapper[4717]: I0221 22:50:38.173109 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-gshck_dee1c8dd-766e-41de-8631-cfc7d23a7681/ssh-known-hosts-edpm-deployment/0.log" Feb 21 22:50:38 crc kubenswrapper[4717]: I0221 22:50:38.402177 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c4dc8df6c-b88lw_b1949ec1-5153-4003-b960-68a8f126b72d/proxy-server/0.log" Feb 21 22:50:38 crc kubenswrapper[4717]: I0221 22:50:38.465945 4717 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-proxy-5c4dc8df6c-b88lw_b1949ec1-5153-4003-b960-68a8f126b72d/proxy-httpd/0.log" Feb 21 22:50:38 crc kubenswrapper[4717]: I0221 22:50:38.503962 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-5zc8b_9d1b5d67-1e8c-4c1f-a6a3-9634827165f8/swift-ring-rebalance/0.log" Feb 21 22:50:38 crc kubenswrapper[4717]: I0221 22:50:38.688605 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/account-reaper/0.log" Feb 21 22:50:38 crc kubenswrapper[4717]: I0221 22:50:38.712559 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/account-auditor/0.log" Feb 21 22:50:38 crc kubenswrapper[4717]: I0221 22:50:38.746242 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/account-replicator/0.log" Feb 21 22:50:38 crc kubenswrapper[4717]: I0221 22:50:38.820313 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/account-server/0.log" Feb 21 22:50:38 crc kubenswrapper[4717]: I0221 22:50:38.919707 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/container-auditor/0.log" Feb 21 22:50:38 crc kubenswrapper[4717]: I0221 22:50:38.927005 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/container-replicator/0.log" Feb 21 22:50:38 crc kubenswrapper[4717]: I0221 22:50:38.968705 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/container-server/0.log" Feb 21 22:50:39 crc kubenswrapper[4717]: I0221 22:50:39.050597 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/container-updater/0.log" Feb 21 22:50:39 crc kubenswrapper[4717]: I0221 22:50:39.182172 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/object-replicator/0.log" Feb 21 22:50:39 crc kubenswrapper[4717]: I0221 22:50:39.182685 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/object-auditor/0.log" Feb 21 22:50:39 crc kubenswrapper[4717]: I0221 22:50:39.197680 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/object-expirer/0.log" Feb 21 22:50:39 crc kubenswrapper[4717]: I0221 22:50:39.252309 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/object-server/0.log" Feb 21 22:50:39 crc kubenswrapper[4717]: I0221 22:50:39.341410 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/object-updater/0.log" Feb 21 22:50:39 crc kubenswrapper[4717]: I0221 22:50:39.355927 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/rsync/0.log" Feb 21 22:50:39 crc kubenswrapper[4717]: I0221 22:50:39.498654 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1fc309c6-44f4-4daf-90fa-6bf6845f195d/swift-recon-cron/0.log" Feb 21 22:50:39 crc kubenswrapper[4717]: I0221 22:50:39.574305 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-v4ck2_6890db4e-d63d-4b19-87aa-b5186b85ece1/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:39 crc kubenswrapper[4717]: I0221 22:50:39.699407 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_4bfb2e0c-d6fd-4a84-850e-6d680fa7ee35/tempest-tests-tempest-tests-runner/0.log" Feb 21 22:50:39 crc kubenswrapper[4717]: I0221 22:50:39.784905 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_943cf36f-ab88-4c12-a0c6-455facbc74ad/test-operator-logs-container/0.log" Feb 21 22:50:39 crc kubenswrapper[4717]: I0221 22:50:39.940230 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-hs8lx_2eefd54d-a632-4fd6-a45c-e12e1f810d4a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 21 22:50:44 crc kubenswrapper[4717]: I0221 22:50:44.976331 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac" Feb 21 22:50:45 crc kubenswrapper[4717]: I0221 22:50:45.991256 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"984daa8c5a04e9002764dbe7ebc03997ead5f21f5959852a7982e44ec2f84886"} Feb 21 22:50:48 crc kubenswrapper[4717]: I0221 22:50:48.422179 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_85a966c3-05cc-49d0-ae99-0c774c67e89d/memcached/0.log" Feb 21 22:51:05 crc kubenswrapper[4717]: I0221 22:51:05.484205 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8_04ed380b-d424-43c9-b15c-384a60a084a0/util/0.log" Feb 21 22:51:05 crc kubenswrapper[4717]: I0221 22:51:05.654718 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8_04ed380b-d424-43c9-b15c-384a60a084a0/pull/0.log" Feb 21 22:51:05 crc kubenswrapper[4717]: I0221 22:51:05.669993 4717 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8_04ed380b-d424-43c9-b15c-384a60a084a0/util/0.log" Feb 21 22:51:05 crc kubenswrapper[4717]: I0221 22:51:05.684238 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8_04ed380b-d424-43c9-b15c-384a60a084a0/pull/0.log" Feb 21 22:51:05 crc kubenswrapper[4717]: I0221 22:51:05.815912 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8_04ed380b-d424-43c9-b15c-384a60a084a0/util/0.log" Feb 21 22:51:05 crc kubenswrapper[4717]: I0221 22:51:05.830276 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8_04ed380b-d424-43c9-b15c-384a60a084a0/pull/0.log" Feb 21 22:51:05 crc kubenswrapper[4717]: I0221 22:51:05.842959 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e2c4ee237d24d45c698cc3c1bc248d94ee822aa6efc33717916111f5d5zsr8_04ed380b-d424-43c9-b15c-384a60a084a0/extract/0.log" Feb 21 22:51:06 crc kubenswrapper[4717]: I0221 22:51:06.202013 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-lgrct_4caf3d32-5fe1-4711-a79f-7ff3b2bee3a6/manager/0.log" Feb 21 22:51:06 crc kubenswrapper[4717]: I0221 22:51:06.467166 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-blpgd_8cb19ba1-4432-41e7-afee-6fccd02f8564/manager/0.log" Feb 21 22:51:06 crc kubenswrapper[4717]: I0221 22:51:06.705015 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-mwvrn_1c89580c-c289-4ca5-b394-a85fa285dc30/manager/0.log" Feb 21 
22:51:06 crc kubenswrapper[4717]: I0221 22:51:06.995813 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-qqrnp_5b8a35aa-e7ad-4103-b3db-1011411811db/manager/0.log" Feb 21 22:51:07 crc kubenswrapper[4717]: I0221 22:51:07.235031 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-klhlb_c5916af5-fc6c-4473-aafd-5331043ac1d8/manager/0.log" Feb 21 22:51:07 crc kubenswrapper[4717]: I0221 22:51:07.405514 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-fkch5_85ba4b92-6749-498a-b112-db89d6856988/manager/0.log" Feb 21 22:51:07 crc kubenswrapper[4717]: I0221 22:51:07.542089 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-pb7k5_ff6fe6a4-86ca-4723-915d-b69be63387b6/manager/0.log" Feb 21 22:51:07 crc kubenswrapper[4717]: I0221 22:51:07.622364 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-mflb2_adb80d2f-050a-47f9-afe2-46cd5876e640/manager/0.log" Feb 21 22:51:07 crc kubenswrapper[4717]: I0221 22:51:07.744442 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-66ssv_8c916c65-714b-4f8d-b551-c35239deab87/manager/0.log" Feb 21 22:51:07 crc kubenswrapper[4717]: I0221 22:51:07.907036 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-96s5h_825c4fa5-a334-48b9-9ae0-583beb7e6a6b/manager/0.log" Feb 21 22:51:08 crc kubenswrapper[4717]: I0221 22:51:08.071347 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-vzfhh_5baed11e-00fc-4c09-8a82-fb761682244e/manager/0.log" Feb 21 22:51:08 crc kubenswrapper[4717]: I0221 22:51:08.330622 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-8mw2x_a2620c81-a9f0-4d4c-b281-5e5effb23419/manager/0.log" Feb 21 22:51:08 crc kubenswrapper[4717]: I0221 22:51:08.357449 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-7glks_d839bd2c-8b12-4d02-a6b5-0399f3ded9fd/manager/0.log" Feb 21 22:51:08 crc kubenswrapper[4717]: I0221 22:51:08.601996 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9crsgjg_6d86a5a0-240a-4b65-af2b-6a5d91d95744/manager/0.log" Feb 21 22:51:08 crc kubenswrapper[4717]: I0221 22:51:08.895247 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5ccb695f5f-bb64r_29923aad-fe1a-464c-8f19-dc10ef9e4eaa/operator/0.log" Feb 21 22:51:09 crc kubenswrapper[4717]: I0221 22:51:09.113596 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hs5wv_0d953b3d-37a2-403a-bba7-369dc024f173/registry-server/0.log" Feb 21 22:51:09 crc kubenswrapper[4717]: I0221 22:51:09.350610 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-gldvl_4b328324-d3f1-4de9-b5b0-fb28bd7dfedd/manager/0.log" Feb 21 22:51:09 crc kubenswrapper[4717]: I0221 22:51:09.480921 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-xs5xq_3204092b-362c-42ed-ab07-3db2d36d32e5/manager/0.log" Feb 21 22:51:09 crc kubenswrapper[4717]: I0221 22:51:09.741409 4717 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-thgdl_8c59a072-f1fc-4ef2-b9c4-f88081ea3a2d/operator/0.log" Feb 21 22:51:09 crc kubenswrapper[4717]: I0221 22:51:09.810397 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-b5bzj_7a130695-4494-482a-b4fb-4703071fd28f/manager/0.log" Feb 21 22:51:10 crc kubenswrapper[4717]: I0221 22:51:10.025078 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-589c568786-6vmk8_97c089e6-a2dd-4d56-8a8f-9c8d4d8f6f8e/manager/0.log" Feb 21 22:51:10 crc kubenswrapper[4717]: I0221 22:51:10.144505 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-j6mms_a8cafe00-f55d-4444-ae31-827ac956b47c/manager/0.log" Feb 21 22:51:10 crc kubenswrapper[4717]: I0221 22:51:10.296076 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-thwwt_037ce2e5-e940-4172-80f4-f3d738a9d363/manager/0.log" Feb 21 22:51:10 crc kubenswrapper[4717]: I0221 22:51:10.408933 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-85dff9d968-589dj_b441ea16-f9a2-4ce5-8902-6bb4bcdc18e8/manager/0.log" Feb 21 22:51:11 crc kubenswrapper[4717]: I0221 22:51:11.922343 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-877lb_14e06d52-8282-4fcd-9cec-6c29a6336057/manager/0.log" Feb 21 22:51:31 crc kubenswrapper[4717]: I0221 22:51:31.149330 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-r68hh_ff47caaa-cf37-40ea-8c6c-457189a5432b/control-plane-machine-set-operator/0.log" Feb 21 22:51:31 crc kubenswrapper[4717]: I0221 
22:51:31.322047 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-84tzn_56758c2e-648d-41fe-8758-439f0070d150/machine-api-operator/0.log" Feb 21 22:51:31 crc kubenswrapper[4717]: I0221 22:51:31.351103 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-84tzn_56758c2e-648d-41fe-8758-439f0070d150/kube-rbac-proxy/0.log" Feb 21 22:51:44 crc kubenswrapper[4717]: I0221 22:51:44.517884 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-tmtcj_36b94939-dd01-40b9-a23b-6408a2cd36e8/cert-manager-controller/0.log" Feb 21 22:51:44 crc kubenswrapper[4717]: I0221 22:51:44.683730 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vkn9j_86b0523d-fba9-48d5-aaa6-33682ae7336a/cert-manager-webhook/0.log" Feb 21 22:51:44 crc kubenswrapper[4717]: I0221 22:51:44.727445 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-pwd5d_11020ca7-c11f-4b18-a5f1-e1ab7bc148d2/cert-manager-cainjector/0.log" Feb 21 22:51:58 crc kubenswrapper[4717]: I0221 22:51:58.124460 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-sdlq7_bef85b90-a8e3-4dbe-b0e7-f57e585bbc15/nmstate-console-plugin/0.log" Feb 21 22:51:58 crc kubenswrapper[4717]: I0221 22:51:58.320057 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-wls9d_412a89db-473e-4710-ab4a-ecda68d76787/nmstate-handler/0.log" Feb 21 22:51:59 crc kubenswrapper[4717]: I0221 22:51:59.561659 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-f4sk4_3e791ecf-1811-4ca1-b08c-5e19bb2ee4e1/nmstate-metrics/0.log" Feb 21 22:51:59 crc kubenswrapper[4717]: I0221 22:51:59.569155 4717 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-g8tr4_68767c95-c372-4de6-bab2-8eaae4cb37b3/nmstate-webhook/0.log" Feb 21 22:51:59 crc kubenswrapper[4717]: I0221 22:51:59.586164 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-f4sk4_3e791ecf-1811-4ca1-b08c-5e19bb2ee4e1/kube-rbac-proxy/0.log" Feb 21 22:51:59 crc kubenswrapper[4717]: I0221 22:51:59.589739 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-6z5b8_4074f791-7fb0-4f78-96e7-e926cebfab66/nmstate-operator/0.log" Feb 21 22:52:30 crc kubenswrapper[4717]: I0221 22:52:30.743280 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-gq99r_75d38a7a-9bcd-49d6-812c-6d451c933f87/kube-rbac-proxy/0.log" Feb 21 22:52:30 crc kubenswrapper[4717]: I0221 22:52:30.925778 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-gq99r_75d38a7a-9bcd-49d6-812c-6d451c933f87/controller/0.log" Feb 21 22:52:30 crc kubenswrapper[4717]: I0221 22:52:30.971918 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-frr-files/0.log" Feb 21 22:52:31 crc kubenswrapper[4717]: I0221 22:52:31.904644 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-frr-files/0.log" Feb 21 22:52:31 crc kubenswrapper[4717]: I0221 22:52:31.931491 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-reloader/0.log" Feb 21 22:52:31 crc kubenswrapper[4717]: I0221 22:52:31.939637 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-reloader/0.log" Feb 21 22:52:31 crc kubenswrapper[4717]: I0221 22:52:31.939900 4717 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-metrics/0.log" Feb 21 22:52:32 crc kubenswrapper[4717]: I0221 22:52:32.097971 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-frr-files/0.log" Feb 21 22:52:32 crc kubenswrapper[4717]: I0221 22:52:32.125908 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-metrics/0.log" Feb 21 22:52:32 crc kubenswrapper[4717]: I0221 22:52:32.126284 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-reloader/0.log" Feb 21 22:52:32 crc kubenswrapper[4717]: I0221 22:52:32.167029 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-metrics/0.log" Feb 21 22:52:32 crc kubenswrapper[4717]: I0221 22:52:32.300898 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-frr-files/0.log" Feb 21 22:52:32 crc kubenswrapper[4717]: I0221 22:52:32.328279 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-reloader/0.log" Feb 21 22:52:32 crc kubenswrapper[4717]: I0221 22:52:32.352389 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/cp-metrics/0.log" Feb 21 22:52:32 crc kubenswrapper[4717]: I0221 22:52:32.407492 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/controller/0.log" Feb 21 22:52:32 crc kubenswrapper[4717]: I0221 22:52:32.534218 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/frr-metrics/0.log" Feb 21 22:52:32 crc kubenswrapper[4717]: I0221 22:52:32.563195 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/kube-rbac-proxy/0.log" Feb 21 22:52:32 crc kubenswrapper[4717]: I0221 22:52:32.637483 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/kube-rbac-proxy-frr/0.log" Feb 21 22:52:32 crc kubenswrapper[4717]: I0221 22:52:32.725262 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/reloader/0.log" Feb 21 22:52:32 crc kubenswrapper[4717]: I0221 22:52:32.857199 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-bklbl_0f79dc1a-f5c3-4b0b-827f-1aeb3729478e/frr-k8s-webhook-server/0.log" Feb 21 22:52:33 crc kubenswrapper[4717]: I0221 22:52:33.014562 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-644bd788d-hj4nv_2c9c10bb-4b07-4f5e-af44-0353e53010d1/manager/0.log" Feb 21 22:52:33 crc kubenswrapper[4717]: I0221 22:52:33.217373 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7cdb748cc4-h69fk_d39f5f5c-c8f9-4413-a6a7-abd0aa1beb95/webhook-server/0.log" Feb 21 22:52:33 crc kubenswrapper[4717]: I0221 22:52:33.744444 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9gwt_c0cc3cb1-f7eb-49a4-a537-9bbebf6c3f07/frr/0.log" Feb 21 22:52:33 crc kubenswrapper[4717]: I0221 22:52:33.821898 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-r4xwk_1063cb22-b437-4152-913e-9673c0a51b7a/kube-rbac-proxy/0.log" Feb 21 22:52:34 crc kubenswrapper[4717]: I0221 22:52:34.041670 4717 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-r4xwk_1063cb22-b437-4152-913e-9673c0a51b7a/speaker/0.log" Feb 21 22:52:48 crc kubenswrapper[4717]: I0221 22:52:48.116524 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff_c14c7d98-9c2b-456a-9dc8-0857592681bb/util/0.log" Feb 21 22:52:48 crc kubenswrapper[4717]: I0221 22:52:48.333161 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff_c14c7d98-9c2b-456a-9dc8-0857592681bb/pull/0.log" Feb 21 22:52:48 crc kubenswrapper[4717]: I0221 22:52:48.339341 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff_c14c7d98-9c2b-456a-9dc8-0857592681bb/util/0.log" Feb 21 22:52:48 crc kubenswrapper[4717]: I0221 22:52:48.374265 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff_c14c7d98-9c2b-456a-9dc8-0857592681bb/pull/0.log" Feb 21 22:52:48 crc kubenswrapper[4717]: I0221 22:52:48.533410 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff_c14c7d98-9c2b-456a-9dc8-0857592681bb/util/0.log" Feb 21 22:52:48 crc kubenswrapper[4717]: I0221 22:52:48.533514 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff_c14c7d98-9c2b-456a-9dc8-0857592681bb/pull/0.log" Feb 21 22:52:48 crc kubenswrapper[4717]: I0221 22:52:48.543540 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132bbff_c14c7d98-9c2b-456a-9dc8-0857592681bb/extract/0.log" Feb 21 22:52:48 crc 
kubenswrapper[4717]: I0221 22:52:48.700585 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6d6h2_faf0d743-c229-4c65-b34e-8220f1bb1cc1/extract-utilities/0.log" Feb 21 22:52:48 crc kubenswrapper[4717]: I0221 22:52:48.859921 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6d6h2_faf0d743-c229-4c65-b34e-8220f1bb1cc1/extract-content/0.log" Feb 21 22:52:48 crc kubenswrapper[4717]: I0221 22:52:48.884204 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6d6h2_faf0d743-c229-4c65-b34e-8220f1bb1cc1/extract-content/0.log" Feb 21 22:52:48 crc kubenswrapper[4717]: I0221 22:52:48.892635 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6d6h2_faf0d743-c229-4c65-b34e-8220f1bb1cc1/extract-utilities/0.log" Feb 21 22:52:49 crc kubenswrapper[4717]: I0221 22:52:49.039776 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6d6h2_faf0d743-c229-4c65-b34e-8220f1bb1cc1/extract-utilities/0.log" Feb 21 22:52:49 crc kubenswrapper[4717]: I0221 22:52:49.081544 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6d6h2_faf0d743-c229-4c65-b34e-8220f1bb1cc1/extract-content/0.log" Feb 21 22:52:49 crc kubenswrapper[4717]: I0221 22:52:49.231449 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dljqt_175e04b6-74b9-44e7-94ce-950ffeb16677/extract-utilities/0.log" Feb 21 22:52:49 crc kubenswrapper[4717]: I0221 22:52:49.455833 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dljqt_175e04b6-74b9-44e7-94ce-950ffeb16677/extract-utilities/0.log" Feb 21 22:52:49 crc kubenswrapper[4717]: I0221 22:52:49.490455 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-dljqt_175e04b6-74b9-44e7-94ce-950ffeb16677/extract-content/0.log" Feb 21 22:52:49 crc kubenswrapper[4717]: I0221 22:52:49.536339 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6d6h2_faf0d743-c229-4c65-b34e-8220f1bb1cc1/registry-server/0.log" Feb 21 22:52:49 crc kubenswrapper[4717]: I0221 22:52:49.549166 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dljqt_175e04b6-74b9-44e7-94ce-950ffeb16677/extract-content/0.log" Feb 21 22:52:49 crc kubenswrapper[4717]: I0221 22:52:49.657444 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dljqt_175e04b6-74b9-44e7-94ce-950ffeb16677/extract-utilities/0.log" Feb 21 22:52:49 crc kubenswrapper[4717]: I0221 22:52:49.696796 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dljqt_175e04b6-74b9-44e7-94ce-950ffeb16677/extract-content/0.log" Feb 21 22:52:49 crc kubenswrapper[4717]: I0221 22:52:49.879468 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp_eb912310-4451-4fc6-b05b-675b6bcfff59/util/0.log" Feb 21 22:52:50 crc kubenswrapper[4717]: I0221 22:52:50.091079 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp_eb912310-4451-4fc6-b05b-675b6bcfff59/util/0.log" Feb 21 22:52:50 crc kubenswrapper[4717]: I0221 22:52:50.116607 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp_eb912310-4451-4fc6-b05b-675b6bcfff59/pull/0.log" Feb 21 22:52:50 crc kubenswrapper[4717]: I0221 22:52:50.120158 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp_eb912310-4451-4fc6-b05b-675b6bcfff59/pull/0.log" Feb 21 22:52:50 crc kubenswrapper[4717]: I0221 22:52:50.267277 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dljqt_175e04b6-74b9-44e7-94ce-950ffeb16677/registry-server/0.log" Feb 21 22:52:50 crc kubenswrapper[4717]: I0221 22:52:50.327819 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp_eb912310-4451-4fc6-b05b-675b6bcfff59/util/0.log" Feb 21 22:52:50 crc kubenswrapper[4717]: I0221 22:52:50.356476 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp_eb912310-4451-4fc6-b05b-675b6bcfff59/extract/0.log" Feb 21 22:52:50 crc kubenswrapper[4717]: I0221 22:52:50.371171 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecazqzrp_eb912310-4451-4fc6-b05b-675b6bcfff59/pull/0.log" Feb 21 22:52:50 crc kubenswrapper[4717]: I0221 22:52:50.492005 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kd64g_5dd8c724-ea15-4c93-b15b-cfe5b39d9c1d/marketplace-operator/0.log" Feb 21 22:52:50 crc kubenswrapper[4717]: I0221 22:52:50.571230 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dhgrt_57258517-86da-432f-8123-e3af4325d01a/extract-utilities/0.log" Feb 21 22:52:50 crc kubenswrapper[4717]: I0221 22:52:50.749971 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dhgrt_57258517-86da-432f-8123-e3af4325d01a/extract-content/0.log" Feb 21 22:52:50 crc kubenswrapper[4717]: I0221 22:52:50.795655 4717 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dhgrt_57258517-86da-432f-8123-e3af4325d01a/extract-utilities/0.log" Feb 21 22:52:50 crc kubenswrapper[4717]: I0221 22:52:50.796539 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dhgrt_57258517-86da-432f-8123-e3af4325d01a/extract-content/0.log" Feb 21 22:52:50 crc kubenswrapper[4717]: I0221 22:52:50.935962 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dhgrt_57258517-86da-432f-8123-e3af4325d01a/extract-utilities/0.log" Feb 21 22:52:50 crc kubenswrapper[4717]: I0221 22:52:50.993045 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dhgrt_57258517-86da-432f-8123-e3af4325d01a/extract-content/0.log" Feb 21 22:52:51 crc kubenswrapper[4717]: I0221 22:52:51.134772 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dhgrt_57258517-86da-432f-8123-e3af4325d01a/registry-server/0.log" Feb 21 22:52:51 crc kubenswrapper[4717]: I0221 22:52:51.148252 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pbvfd_38f581da-ba2e-469c-b3eb-745f5b14190e/extract-utilities/0.log" Feb 21 22:52:51 crc kubenswrapper[4717]: I0221 22:52:51.322169 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pbvfd_38f581da-ba2e-469c-b3eb-745f5b14190e/extract-content/0.log" Feb 21 22:52:51 crc kubenswrapper[4717]: I0221 22:52:51.360646 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pbvfd_38f581da-ba2e-469c-b3eb-745f5b14190e/extract-utilities/0.log" Feb 21 22:52:51 crc kubenswrapper[4717]: I0221 22:52:51.362088 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-pbvfd_38f581da-ba2e-469c-b3eb-745f5b14190e/extract-content/0.log" Feb 21 22:52:51 crc kubenswrapper[4717]: I0221 22:52:51.531344 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pbvfd_38f581da-ba2e-469c-b3eb-745f5b14190e/extract-content/0.log" Feb 21 22:52:51 crc kubenswrapper[4717]: I0221 22:52:51.613345 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pbvfd_38f581da-ba2e-469c-b3eb-745f5b14190e/extract-utilities/0.log" Feb 21 22:52:51 crc kubenswrapper[4717]: I0221 22:52:51.901220 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pbvfd_38f581da-ba2e-469c-b3eb-745f5b14190e/registry-server/0.log" Feb 21 22:53:09 crc kubenswrapper[4717]: I0221 22:53:09.062473 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:53:09 crc kubenswrapper[4717]: I0221 22:53:09.063245 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:53:12 crc kubenswrapper[4717]: E0221 22:53:12.466477 4717 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.65:57408->38.102.83.65:45985: write tcp 38.102.83.65:57408->38.102.83.65:45985: write: broken pipe Feb 21 22:53:39 crc kubenswrapper[4717]: I0221 22:53:39.062260 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:53:39 crc kubenswrapper[4717]: I0221 22:53:39.062967 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:54:09 crc kubenswrapper[4717]: I0221 22:54:09.062626 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:54:09 crc kubenswrapper[4717]: I0221 22:54:09.063313 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:54:09 crc kubenswrapper[4717]: I0221 22:54:09.063381 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22" Feb 21 22:54:09 crc kubenswrapper[4717]: I0221 22:54:09.064503 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"984daa8c5a04e9002764dbe7ebc03997ead5f21f5959852a7982e44ec2f84886"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 21 22:54:09 crc 
kubenswrapper[4717]: I0221 22:54:09.064687 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://984daa8c5a04e9002764dbe7ebc03997ead5f21f5959852a7982e44ec2f84886" gracePeriod=600 Feb 21 22:54:10 crc kubenswrapper[4717]: I0221 22:54:10.079822 4717 generic.go:334] "Generic (PLEG): container finished" podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerID="984daa8c5a04e9002764dbe7ebc03997ead5f21f5959852a7982e44ec2f84886" exitCode=0 Feb 21 22:54:10 crc kubenswrapper[4717]: I0221 22:54:10.079896 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"984daa8c5a04e9002764dbe7ebc03997ead5f21f5959852a7982e44ec2f84886"} Feb 21 22:54:10 crc kubenswrapper[4717]: I0221 22:54:10.080318 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerStarted","Data":"cce2a7f124d7516a154981ac0890e510e8ff6be01df9c5bc02a467d04942fc35"} Feb 21 22:54:10 crc kubenswrapper[4717]: I0221 22:54:10.080336 4717 scope.go:117] "RemoveContainer" containerID="210aed4f683b7243fec68054bd5440574f7bf2aaf845f83dd036172b4ebe33ac" Feb 21 22:54:18 crc kubenswrapper[4717]: I0221 22:54:18.237076 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6j2gv"] Feb 21 22:54:18 crc kubenswrapper[4717]: E0221 22:54:18.238223 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c9c1d7-d921-41d4-9775-3823dc4f5422" containerName="container-00" Feb 21 22:54:18 crc kubenswrapper[4717]: I0221 22:54:18.238245 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="79c9c1d7-d921-41d4-9775-3823dc4f5422" containerName="container-00"
Feb 21 22:54:18 crc kubenswrapper[4717]: I0221 22:54:18.238647 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="79c9c1d7-d921-41d4-9775-3823dc4f5422" containerName="container-00"
Feb 21 22:54:18 crc kubenswrapper[4717]: I0221 22:54:18.248595 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:18 crc kubenswrapper[4717]: I0221 22:54:18.281893 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6j2gv"]
Feb 21 22:54:18 crc kubenswrapper[4717]: I0221 22:54:18.350704 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cab8622-e5ae-4002-8837-22e125695a77-catalog-content\") pod \"redhat-marketplace-6j2gv\" (UID: \"8cab8622-e5ae-4002-8837-22e125695a77\") " pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:18 crc kubenswrapper[4717]: I0221 22:54:18.351073 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs4nq\" (UniqueName: \"kubernetes.io/projected/8cab8622-e5ae-4002-8837-22e125695a77-kube-api-access-gs4nq\") pod \"redhat-marketplace-6j2gv\" (UID: \"8cab8622-e5ae-4002-8837-22e125695a77\") " pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:18 crc kubenswrapper[4717]: I0221 22:54:18.351245 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cab8622-e5ae-4002-8837-22e125695a77-utilities\") pod \"redhat-marketplace-6j2gv\" (UID: \"8cab8622-e5ae-4002-8837-22e125695a77\") " pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:18 crc kubenswrapper[4717]: I0221 22:54:18.453166 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cab8622-e5ae-4002-8837-22e125695a77-utilities\") pod \"redhat-marketplace-6j2gv\" (UID: \"8cab8622-e5ae-4002-8837-22e125695a77\") " pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:18 crc kubenswrapper[4717]: I0221 22:54:18.453281 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cab8622-e5ae-4002-8837-22e125695a77-catalog-content\") pod \"redhat-marketplace-6j2gv\" (UID: \"8cab8622-e5ae-4002-8837-22e125695a77\") " pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:18 crc kubenswrapper[4717]: I0221 22:54:18.453367 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs4nq\" (UniqueName: \"kubernetes.io/projected/8cab8622-e5ae-4002-8837-22e125695a77-kube-api-access-gs4nq\") pod \"redhat-marketplace-6j2gv\" (UID: \"8cab8622-e5ae-4002-8837-22e125695a77\") " pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:18 crc kubenswrapper[4717]: I0221 22:54:18.454311 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cab8622-e5ae-4002-8837-22e125695a77-catalog-content\") pod \"redhat-marketplace-6j2gv\" (UID: \"8cab8622-e5ae-4002-8837-22e125695a77\") " pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:18 crc kubenswrapper[4717]: I0221 22:54:18.454465 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cab8622-e5ae-4002-8837-22e125695a77-utilities\") pod \"redhat-marketplace-6j2gv\" (UID: \"8cab8622-e5ae-4002-8837-22e125695a77\") " pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:18 crc kubenswrapper[4717]: I0221 22:54:18.475575 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs4nq\" (UniqueName: \"kubernetes.io/projected/8cab8622-e5ae-4002-8837-22e125695a77-kube-api-access-gs4nq\") pod \"redhat-marketplace-6j2gv\" (UID: \"8cab8622-e5ae-4002-8837-22e125695a77\") " pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:18 crc kubenswrapper[4717]: I0221 22:54:18.595462 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:19 crc kubenswrapper[4717]: I0221 22:54:19.094850 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6j2gv"]
Feb 21 22:54:19 crc kubenswrapper[4717]: I0221 22:54:19.203521 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6j2gv" event={"ID":"8cab8622-e5ae-4002-8837-22e125695a77","Type":"ContainerStarted","Data":"3f69ba38bf491d6e280c21f20eaa0131f97a47fb2b7a1375f151936d497d5081"}
Feb 21 22:54:20 crc kubenswrapper[4717]: I0221 22:54:20.217557 4717 generic.go:334] "Generic (PLEG): container finished" podID="8cab8622-e5ae-4002-8837-22e125695a77" containerID="6122a8d2ee3bd5ac664a48c2638367207c29a87e4a19441f71f8a0cd88453c97" exitCode=0
Feb 21 22:54:20 crc kubenswrapper[4717]: I0221 22:54:20.217666 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6j2gv" event={"ID":"8cab8622-e5ae-4002-8837-22e125695a77","Type":"ContainerDied","Data":"6122a8d2ee3bd5ac664a48c2638367207c29a87e4a19441f71f8a0cd88453c97"}
Feb 21 22:54:20 crc kubenswrapper[4717]: I0221 22:54:20.221326 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 21 22:54:21 crc kubenswrapper[4717]: I0221 22:54:21.229884 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6j2gv" event={"ID":"8cab8622-e5ae-4002-8837-22e125695a77","Type":"ContainerStarted","Data":"bad8f03c04669573c2675a4f48cb10e963e40b860af58e2d4e20445033445443"}
Feb 21 22:54:22 crc kubenswrapper[4717]: I0221 22:54:22.279377 4717 generic.go:334] "Generic (PLEG): container finished" podID="8cab8622-e5ae-4002-8837-22e125695a77" containerID="bad8f03c04669573c2675a4f48cb10e963e40b860af58e2d4e20445033445443" exitCode=0
Feb 21 22:54:22 crc kubenswrapper[4717]: I0221 22:54:22.279567 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6j2gv" event={"ID":"8cab8622-e5ae-4002-8837-22e125695a77","Type":"ContainerDied","Data":"bad8f03c04669573c2675a4f48cb10e963e40b860af58e2d4e20445033445443"}
Feb 21 22:54:23 crc kubenswrapper[4717]: I0221 22:54:23.307196 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6j2gv" event={"ID":"8cab8622-e5ae-4002-8837-22e125695a77","Type":"ContainerStarted","Data":"eaac918a61f418c04b4c4e315a59c335b64df903a750df38ae6ed41d721d943f"}
Feb 21 22:54:23 crc kubenswrapper[4717]: I0221 22:54:23.344118 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6j2gv" podStartSLOduration=2.895361771 podStartE2EDuration="5.344092338s" podCreationTimestamp="2026-02-21 22:54:18 +0000 UTC" firstStartedPulling="2026-02-21 22:54:20.220713347 +0000 UTC m=+4075.002247009" lastFinishedPulling="2026-02-21 22:54:22.669443954 +0000 UTC m=+4077.450977576" observedRunningTime="2026-02-21 22:54:23.33535637 +0000 UTC m=+4078.116890002" watchObservedRunningTime="2026-02-21 22:54:23.344092338 +0000 UTC m=+4078.125625970"
Feb 21 22:54:28 crc kubenswrapper[4717]: I0221 22:54:28.595594 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:28 crc kubenswrapper[4717]: I0221 22:54:28.597538 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:28 crc kubenswrapper[4717]: I0221 22:54:28.687356 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:29 crc kubenswrapper[4717]: I0221 22:54:29.455037 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:29 crc kubenswrapper[4717]: I0221 22:54:29.526066 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6j2gv"]
Feb 21 22:54:31 crc kubenswrapper[4717]: I0221 22:54:31.415993 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6j2gv" podUID="8cab8622-e5ae-4002-8837-22e125695a77" containerName="registry-server" containerID="cri-o://eaac918a61f418c04b4c4e315a59c335b64df903a750df38ae6ed41d721d943f" gracePeriod=2
Feb 21 22:54:32 crc kubenswrapper[4717]: I0221 22:54:32.428067 4717 generic.go:334] "Generic (PLEG): container finished" podID="8cab8622-e5ae-4002-8837-22e125695a77" containerID="eaac918a61f418c04b4c4e315a59c335b64df903a750df38ae6ed41d721d943f" exitCode=0
Feb 21 22:54:32 crc kubenswrapper[4717]: I0221 22:54:32.428281 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6j2gv" event={"ID":"8cab8622-e5ae-4002-8837-22e125695a77","Type":"ContainerDied","Data":"eaac918a61f418c04b4c4e315a59c335b64df903a750df38ae6ed41d721d943f"}
Feb 21 22:54:32 crc kubenswrapper[4717]: I0221 22:54:32.428459 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6j2gv" event={"ID":"8cab8622-e5ae-4002-8837-22e125695a77","Type":"ContainerDied","Data":"3f69ba38bf491d6e280c21f20eaa0131f97a47fb2b7a1375f151936d497d5081"}
Feb 21 22:54:32 crc kubenswrapper[4717]: I0221 22:54:32.428478 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f69ba38bf491d6e280c21f20eaa0131f97a47fb2b7a1375f151936d497d5081"
Feb 21 22:54:32 crc kubenswrapper[4717]: I0221 22:54:32.500332 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:32 crc kubenswrapper[4717]: I0221 22:54:32.569578 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cab8622-e5ae-4002-8837-22e125695a77-catalog-content\") pod \"8cab8622-e5ae-4002-8837-22e125695a77\" (UID: \"8cab8622-e5ae-4002-8837-22e125695a77\") "
Feb 21 22:54:32 crc kubenswrapper[4717]: I0221 22:54:32.569654 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cab8622-e5ae-4002-8837-22e125695a77-utilities\") pod \"8cab8622-e5ae-4002-8837-22e125695a77\" (UID: \"8cab8622-e5ae-4002-8837-22e125695a77\") "
Feb 21 22:54:32 crc kubenswrapper[4717]: I0221 22:54:32.569712 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs4nq\" (UniqueName: \"kubernetes.io/projected/8cab8622-e5ae-4002-8837-22e125695a77-kube-api-access-gs4nq\") pod \"8cab8622-e5ae-4002-8837-22e125695a77\" (UID: \"8cab8622-e5ae-4002-8837-22e125695a77\") "
Feb 21 22:54:32 crc kubenswrapper[4717]: I0221 22:54:32.570630 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cab8622-e5ae-4002-8837-22e125695a77-utilities" (OuterVolumeSpecName: "utilities") pod "8cab8622-e5ae-4002-8837-22e125695a77" (UID: "8cab8622-e5ae-4002-8837-22e125695a77"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:54:32 crc kubenswrapper[4717]: I0221 22:54:32.576157 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cab8622-e5ae-4002-8837-22e125695a77-kube-api-access-gs4nq" (OuterVolumeSpecName: "kube-api-access-gs4nq") pod "8cab8622-e5ae-4002-8837-22e125695a77" (UID: "8cab8622-e5ae-4002-8837-22e125695a77"). InnerVolumeSpecName "kube-api-access-gs4nq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:54:32 crc kubenswrapper[4717]: I0221 22:54:32.593120 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cab8622-e5ae-4002-8837-22e125695a77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8cab8622-e5ae-4002-8837-22e125695a77" (UID: "8cab8622-e5ae-4002-8837-22e125695a77"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:54:32 crc kubenswrapper[4717]: I0221 22:54:32.671564 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cab8622-e5ae-4002-8837-22e125695a77-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 21 22:54:32 crc kubenswrapper[4717]: I0221 22:54:32.671596 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cab8622-e5ae-4002-8837-22e125695a77-utilities\") on node \"crc\" DevicePath \"\""
Feb 21 22:54:32 crc kubenswrapper[4717]: I0221 22:54:32.671608 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs4nq\" (UniqueName: \"kubernetes.io/projected/8cab8622-e5ae-4002-8837-22e125695a77-kube-api-access-gs4nq\") on node \"crc\" DevicePath \"\""
Feb 21 22:54:33 crc kubenswrapper[4717]: I0221 22:54:33.439082 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6j2gv"
Feb 21 22:54:33 crc kubenswrapper[4717]: I0221 22:54:33.480465 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6j2gv"]
Feb 21 22:54:33 crc kubenswrapper[4717]: I0221 22:54:33.490544 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6j2gv"]
Feb 21 22:54:33 crc kubenswrapper[4717]: I0221 22:54:33.995071 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cab8622-e5ae-4002-8837-22e125695a77" path="/var/lib/kubelet/pods/8cab8622-e5ae-4002-8837-22e125695a77/volumes"
Feb 21 22:54:37 crc kubenswrapper[4717]: I0221 22:54:37.517978 4717 generic.go:334] "Generic (PLEG): container finished" podID="c4ca7e21-366d-48a0-85c5-fae13f9026d9" containerID="3c403ddffa24a287cd6beaab0a626a78ef33b6335cadfaf128af91fb15f65c6f" exitCode=0
Feb 21 22:54:37 crc kubenswrapper[4717]: I0221 22:54:37.518132 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-987jj/must-gather-x2sgd" event={"ID":"c4ca7e21-366d-48a0-85c5-fae13f9026d9","Type":"ContainerDied","Data":"3c403ddffa24a287cd6beaab0a626a78ef33b6335cadfaf128af91fb15f65c6f"}
Feb 21 22:54:37 crc kubenswrapper[4717]: I0221 22:54:37.519813 4717 scope.go:117] "RemoveContainer" containerID="3c403ddffa24a287cd6beaab0a626a78ef33b6335cadfaf128af91fb15f65c6f"
Feb 21 22:54:38 crc kubenswrapper[4717]: I0221 22:54:38.185738 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-987jj_must-gather-x2sgd_c4ca7e21-366d-48a0-85c5-fae13f9026d9/gather/0.log"
Feb 21 22:54:49 crc kubenswrapper[4717]: I0221 22:54:49.226600 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-987jj/must-gather-x2sgd"]
Feb 21 22:54:49 crc kubenswrapper[4717]: I0221 22:54:49.229163 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-987jj/must-gather-x2sgd" podUID="c4ca7e21-366d-48a0-85c5-fae13f9026d9" containerName="copy" containerID="cri-o://6fe2d14c4b886ecd9edcb94bd6a903f5dbb1aa488bb53478950a35183d87a347" gracePeriod=2
Feb 21 22:54:49 crc kubenswrapper[4717]: I0221 22:54:49.237951 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-987jj/must-gather-x2sgd"]
Feb 21 22:54:49 crc kubenswrapper[4717]: I0221 22:54:49.667070 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-987jj_must-gather-x2sgd_c4ca7e21-366d-48a0-85c5-fae13f9026d9/copy/0.log"
Feb 21 22:54:49 crc kubenswrapper[4717]: I0221 22:54:49.667641 4717 generic.go:334] "Generic (PLEG): container finished" podID="c4ca7e21-366d-48a0-85c5-fae13f9026d9" containerID="6fe2d14c4b886ecd9edcb94bd6a903f5dbb1aa488bb53478950a35183d87a347" exitCode=143
Feb 21 22:54:49 crc kubenswrapper[4717]: I0221 22:54:49.667677 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="702ebee2ba968b5cafdf752197b26902310784554404162e091662f65f6a8910"
Feb 21 22:54:49 crc kubenswrapper[4717]: I0221 22:54:49.689119 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-987jj_must-gather-x2sgd_c4ca7e21-366d-48a0-85c5-fae13f9026d9/copy/0.log"
Feb 21 22:54:49 crc kubenswrapper[4717]: I0221 22:54:49.689497 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/must-gather-x2sgd"
Feb 21 22:54:49 crc kubenswrapper[4717]: I0221 22:54:49.769253 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c4ca7e21-366d-48a0-85c5-fae13f9026d9-must-gather-output\") pod \"c4ca7e21-366d-48a0-85c5-fae13f9026d9\" (UID: \"c4ca7e21-366d-48a0-85c5-fae13f9026d9\") "
Feb 21 22:54:49 crc kubenswrapper[4717]: I0221 22:54:49.769633 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqmrz\" (UniqueName: \"kubernetes.io/projected/c4ca7e21-366d-48a0-85c5-fae13f9026d9-kube-api-access-jqmrz\") pod \"c4ca7e21-366d-48a0-85c5-fae13f9026d9\" (UID: \"c4ca7e21-366d-48a0-85c5-fae13f9026d9\") "
Feb 21 22:54:49 crc kubenswrapper[4717]: I0221 22:54:49.776111 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ca7e21-366d-48a0-85c5-fae13f9026d9-kube-api-access-jqmrz" (OuterVolumeSpecName: "kube-api-access-jqmrz") pod "c4ca7e21-366d-48a0-85c5-fae13f9026d9" (UID: "c4ca7e21-366d-48a0-85c5-fae13f9026d9"). InnerVolumeSpecName "kube-api-access-jqmrz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:54:49 crc kubenswrapper[4717]: I0221 22:54:49.872820 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqmrz\" (UniqueName: \"kubernetes.io/projected/c4ca7e21-366d-48a0-85c5-fae13f9026d9-kube-api-access-jqmrz\") on node \"crc\" DevicePath \"\""
Feb 21 22:54:49 crc kubenswrapper[4717]: I0221 22:54:49.922249 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ca7e21-366d-48a0-85c5-fae13f9026d9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c4ca7e21-366d-48a0-85c5-fae13f9026d9" (UID: "c4ca7e21-366d-48a0-85c5-fae13f9026d9"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:54:49 crc kubenswrapper[4717]: I0221 22:54:49.974625 4717 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c4ca7e21-366d-48a0-85c5-fae13f9026d9-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 21 22:54:49 crc kubenswrapper[4717]: I0221 22:54:49.985388 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ca7e21-366d-48a0-85c5-fae13f9026d9" path="/var/lib/kubelet/pods/c4ca7e21-366d-48a0-85c5-fae13f9026d9/volumes"
Feb 21 22:54:50 crc kubenswrapper[4717]: I0221 22:54:50.675656 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-987jj/must-gather-x2sgd"
Feb 21 22:55:37 crc kubenswrapper[4717]: I0221 22:55:37.450165 4717 scope.go:117] "RemoveContainer" containerID="c3ee99772047ead465663eba81a7a3eaf8eedc6c0c8ef21f30401adbe3d87357"
Feb 21 22:55:37 crc kubenswrapper[4717]: I0221 22:55:37.484937 4717 scope.go:117] "RemoveContainer" containerID="6fe2d14c4b886ecd9edcb94bd6a903f5dbb1aa488bb53478950a35183d87a347"
Feb 21 22:55:37 crc kubenswrapper[4717]: I0221 22:55:37.527126 4717 scope.go:117] "RemoveContainer" containerID="3c403ddffa24a287cd6beaab0a626a78ef33b6335cadfaf128af91fb15f65c6f"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.016739 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6p7nf"]
Feb 21 22:55:50 crc kubenswrapper[4717]: E0221 22:55:50.018159 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cab8622-e5ae-4002-8837-22e125695a77" containerName="extract-content"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.018183 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cab8622-e5ae-4002-8837-22e125695a77" containerName="extract-content"
Feb 21 22:55:50 crc kubenswrapper[4717]: E0221 22:55:50.018228 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ca7e21-366d-48a0-85c5-fae13f9026d9" containerName="gather"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.018237 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ca7e21-366d-48a0-85c5-fae13f9026d9" containerName="gather"
Feb 21 22:55:50 crc kubenswrapper[4717]: E0221 22:55:50.018254 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cab8622-e5ae-4002-8837-22e125695a77" containerName="extract-utilities"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.018262 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cab8622-e5ae-4002-8837-22e125695a77" containerName="extract-utilities"
Feb 21 22:55:50 crc kubenswrapper[4717]: E0221 22:55:50.018306 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cab8622-e5ae-4002-8837-22e125695a77" containerName="registry-server"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.018317 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cab8622-e5ae-4002-8837-22e125695a77" containerName="registry-server"
Feb 21 22:55:50 crc kubenswrapper[4717]: E0221 22:55:50.018335 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ca7e21-366d-48a0-85c5-fae13f9026d9" containerName="copy"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.018345 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ca7e21-366d-48a0-85c5-fae13f9026d9" containerName="copy"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.018691 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ca7e21-366d-48a0-85c5-fae13f9026d9" containerName="copy"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.018732 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ca7e21-366d-48a0-85c5-fae13f9026d9" containerName="gather"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.018766 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cab8622-e5ae-4002-8837-22e125695a77" containerName="registry-server"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.021014 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6p7nf"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.027519 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6p7nf"]
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.137444 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8414f808-c789-4d49-b16e-fbd4dfff5261-utilities\") pod \"redhat-operators-6p7nf\" (UID: \"8414f808-c789-4d49-b16e-fbd4dfff5261\") " pod="openshift-marketplace/redhat-operators-6p7nf"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.137517 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8414f808-c789-4d49-b16e-fbd4dfff5261-catalog-content\") pod \"redhat-operators-6p7nf\" (UID: \"8414f808-c789-4d49-b16e-fbd4dfff5261\") " pod="openshift-marketplace/redhat-operators-6p7nf"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.137707 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x4w4\" (UniqueName: \"kubernetes.io/projected/8414f808-c789-4d49-b16e-fbd4dfff5261-kube-api-access-6x4w4\") pod \"redhat-operators-6p7nf\" (UID: \"8414f808-c789-4d49-b16e-fbd4dfff5261\") " pod="openshift-marketplace/redhat-operators-6p7nf"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.239272 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x4w4\" (UniqueName: \"kubernetes.io/projected/8414f808-c789-4d49-b16e-fbd4dfff5261-kube-api-access-6x4w4\") pod \"redhat-operators-6p7nf\" (UID: \"8414f808-c789-4d49-b16e-fbd4dfff5261\") " pod="openshift-marketplace/redhat-operators-6p7nf"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.239399 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8414f808-c789-4d49-b16e-fbd4dfff5261-utilities\") pod \"redhat-operators-6p7nf\" (UID: \"8414f808-c789-4d49-b16e-fbd4dfff5261\") " pod="openshift-marketplace/redhat-operators-6p7nf"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.239431 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8414f808-c789-4d49-b16e-fbd4dfff5261-catalog-content\") pod \"redhat-operators-6p7nf\" (UID: \"8414f808-c789-4d49-b16e-fbd4dfff5261\") " pod="openshift-marketplace/redhat-operators-6p7nf"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.240121 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8414f808-c789-4d49-b16e-fbd4dfff5261-catalog-content\") pod \"redhat-operators-6p7nf\" (UID: \"8414f808-c789-4d49-b16e-fbd4dfff5261\") " pod="openshift-marketplace/redhat-operators-6p7nf"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.240245 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8414f808-c789-4d49-b16e-fbd4dfff5261-utilities\") pod \"redhat-operators-6p7nf\" (UID: \"8414f808-c789-4d49-b16e-fbd4dfff5261\") " pod="openshift-marketplace/redhat-operators-6p7nf"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.265082 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x4w4\" (UniqueName: \"kubernetes.io/projected/8414f808-c789-4d49-b16e-fbd4dfff5261-kube-api-access-6x4w4\") pod \"redhat-operators-6p7nf\" (UID: \"8414f808-c789-4d49-b16e-fbd4dfff5261\") " pod="openshift-marketplace/redhat-operators-6p7nf"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.379733 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6p7nf"
Feb 21 22:55:50 crc kubenswrapper[4717]: I0221 22:55:50.643455 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6p7nf"]
Feb 21 22:55:51 crc kubenswrapper[4717]: I0221 22:55:51.348356 4717 generic.go:334] "Generic (PLEG): container finished" podID="8414f808-c789-4d49-b16e-fbd4dfff5261" containerID="88bcf22b3b1680e94b76a7518a9903a0c3e338369d56f1f8cdecbee515f225b4" exitCode=0
Feb 21 22:55:51 crc kubenswrapper[4717]: I0221 22:55:51.348466 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p7nf" event={"ID":"8414f808-c789-4d49-b16e-fbd4dfff5261","Type":"ContainerDied","Data":"88bcf22b3b1680e94b76a7518a9903a0c3e338369d56f1f8cdecbee515f225b4"}
Feb 21 22:55:51 crc kubenswrapper[4717]: I0221 22:55:51.348927 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p7nf" event={"ID":"8414f808-c789-4d49-b16e-fbd4dfff5261","Type":"ContainerStarted","Data":"e1b01fe253f3948174106dd7cd93a3bcd3076921e41cad140477b474b33593a8"}
Feb 21 22:55:52 crc kubenswrapper[4717]: I0221 22:55:52.365927 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p7nf" event={"ID":"8414f808-c789-4d49-b16e-fbd4dfff5261","Type":"ContainerStarted","Data":"695c0f499b4eb9e368c5c387760ae7026ada6d6f13ddf04ba916c0f47c2cc6a7"}
Feb 21 22:55:53 crc kubenswrapper[4717]: I0221 22:55:53.381638 4717 generic.go:334] "Generic (PLEG): container finished" podID="8414f808-c789-4d49-b16e-fbd4dfff5261" containerID="695c0f499b4eb9e368c5c387760ae7026ada6d6f13ddf04ba916c0f47c2cc6a7" exitCode=0
Feb 21 22:55:53 crc kubenswrapper[4717]: I0221 22:55:53.383755 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p7nf" event={"ID":"8414f808-c789-4d49-b16e-fbd4dfff5261","Type":"ContainerDied","Data":"695c0f499b4eb9e368c5c387760ae7026ada6d6f13ddf04ba916c0f47c2cc6a7"}
Feb 21 22:55:54 crc kubenswrapper[4717]: I0221 22:55:54.391536 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p7nf" event={"ID":"8414f808-c789-4d49-b16e-fbd4dfff5261","Type":"ContainerStarted","Data":"5e8e1473835cc6f439d30588d3fd4cabf71d1de18d4876cb433ad7d0a40405ad"}
Feb 21 22:55:54 crc kubenswrapper[4717]: I0221 22:55:54.415280 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6p7nf" podStartSLOduration=2.968143816 podStartE2EDuration="5.415251655s" podCreationTimestamp="2026-02-21 22:55:49 +0000 UTC" firstStartedPulling="2026-02-21 22:55:51.350566266 +0000 UTC m=+4166.132099888" lastFinishedPulling="2026-02-21 22:55:53.797674105 +0000 UTC m=+4168.579207727" observedRunningTime="2026-02-21 22:55:54.40832995 +0000 UTC m=+4169.189863612" watchObservedRunningTime="2026-02-21 22:55:54.415251655 +0000 UTC m=+4169.196785317"
Feb 21 22:55:57 crc kubenswrapper[4717]: I0221 22:55:57.397393 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rbpds"]
Feb 21 22:55:57 crc kubenswrapper[4717]: I0221 22:55:57.402437 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rbpds"
Feb 21 22:55:57 crc kubenswrapper[4717]: I0221 22:55:57.414716 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rbpds"]
Feb 21 22:55:57 crc kubenswrapper[4717]: I0221 22:55:57.594549 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c52468a-a653-40cd-8b18-10ba07cc94c5-utilities\") pod \"community-operators-rbpds\" (UID: \"4c52468a-a653-40cd-8b18-10ba07cc94c5\") " pod="openshift-marketplace/community-operators-rbpds"
Feb 21 22:55:57 crc kubenswrapper[4717]: I0221 22:55:57.594674 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2xrq\" (UniqueName: \"kubernetes.io/projected/4c52468a-a653-40cd-8b18-10ba07cc94c5-kube-api-access-j2xrq\") pod \"community-operators-rbpds\" (UID: \"4c52468a-a653-40cd-8b18-10ba07cc94c5\") " pod="openshift-marketplace/community-operators-rbpds"
Feb 21 22:55:57 crc kubenswrapper[4717]: I0221 22:55:57.595380 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c52468a-a653-40cd-8b18-10ba07cc94c5-catalog-content\") pod \"community-operators-rbpds\" (UID: \"4c52468a-a653-40cd-8b18-10ba07cc94c5\") " pod="openshift-marketplace/community-operators-rbpds"
Feb 21 22:55:57 crc kubenswrapper[4717]: I0221 22:55:57.698074 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2xrq\" (UniqueName: \"kubernetes.io/projected/4c52468a-a653-40cd-8b18-10ba07cc94c5-kube-api-access-j2xrq\") pod \"community-operators-rbpds\" (UID: \"4c52468a-a653-40cd-8b18-10ba07cc94c5\") " pod="openshift-marketplace/community-operators-rbpds"
Feb 21 22:55:57 crc kubenswrapper[4717]: I0221 22:55:57.698174 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c52468a-a653-40cd-8b18-10ba07cc94c5-catalog-content\") pod \"community-operators-rbpds\" (UID: \"4c52468a-a653-40cd-8b18-10ba07cc94c5\") " pod="openshift-marketplace/community-operators-rbpds"
Feb 21 22:55:57 crc kubenswrapper[4717]: I0221 22:55:57.698349 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c52468a-a653-40cd-8b18-10ba07cc94c5-utilities\") pod \"community-operators-rbpds\" (UID: \"4c52468a-a653-40cd-8b18-10ba07cc94c5\") " pod="openshift-marketplace/community-operators-rbpds"
Feb 21 22:55:57 crc kubenswrapper[4717]: I0221 22:55:57.699000 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c52468a-a653-40cd-8b18-10ba07cc94c5-utilities\") pod \"community-operators-rbpds\" (UID: \"4c52468a-a653-40cd-8b18-10ba07cc94c5\") " pod="openshift-marketplace/community-operators-rbpds"
Feb 21 22:55:57 crc kubenswrapper[4717]: I0221 22:55:57.699020 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c52468a-a653-40cd-8b18-10ba07cc94c5-catalog-content\") pod \"community-operators-rbpds\" (UID: \"4c52468a-a653-40cd-8b18-10ba07cc94c5\") " pod="openshift-marketplace/community-operators-rbpds"
Feb 21 22:55:57 crc kubenswrapper[4717]: I0221 22:55:57.731581 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2xrq\" (UniqueName: \"kubernetes.io/projected/4c52468a-a653-40cd-8b18-10ba07cc94c5-kube-api-access-j2xrq\") pod \"community-operators-rbpds\" (UID: \"4c52468a-a653-40cd-8b18-10ba07cc94c5\") " pod="openshift-marketplace/community-operators-rbpds"
Feb 21 22:55:57 crc kubenswrapper[4717]: I0221 22:55:57.743398 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rbpds"
Feb 21 22:55:58 crc kubenswrapper[4717]: I0221 22:55:58.878594 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rbpds"]
Feb 21 22:55:58 crc kubenswrapper[4717]: W0221 22:55:58.882189 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c52468a_a653_40cd_8b18_10ba07cc94c5.slice/crio-d3d1b63bede768249aecc28f7b78c8329ff8255eba2c255638cadc0e67b36988 WatchSource:0}: Error finding container d3d1b63bede768249aecc28f7b78c8329ff8255eba2c255638cadc0e67b36988: Status 404 returned error can't find the container with id d3d1b63bede768249aecc28f7b78c8329ff8255eba2c255638cadc0e67b36988
Feb 21 22:55:59 crc kubenswrapper[4717]: I0221 22:55:59.455161 4717 generic.go:334] "Generic (PLEG): container finished" podID="4c52468a-a653-40cd-8b18-10ba07cc94c5" containerID="a45b8a58be5ddf67d36ce3874cfc97582592664ca9e59868dac90b49735f2adc" exitCode=0
Feb 21 22:55:59 crc kubenswrapper[4717]: I0221 22:55:59.455247 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbpds" event={"ID":"4c52468a-a653-40cd-8b18-10ba07cc94c5","Type":"ContainerDied","Data":"a45b8a58be5ddf67d36ce3874cfc97582592664ca9e59868dac90b49735f2adc"}
Feb 21 22:55:59 crc kubenswrapper[4717]: I0221 22:55:59.455622 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbpds" event={"ID":"4c52468a-a653-40cd-8b18-10ba07cc94c5","Type":"ContainerStarted","Data":"d3d1b63bede768249aecc28f7b78c8329ff8255eba2c255638cadc0e67b36988"}
Feb 21 22:56:00 crc kubenswrapper[4717]: I0221 22:56:00.381130 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6p7nf"
Feb 21 22:56:00 crc kubenswrapper[4717]: I0221 22:56:00.381532 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6p7nf"
Feb 21 22:56:00 crc kubenswrapper[4717]: I0221 22:56:00.469618 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbpds" event={"ID":"4c52468a-a653-40cd-8b18-10ba07cc94c5","Type":"ContainerStarted","Data":"81d42d6348c5aac40ec0392ce655cf105fca9850b1154e49bc6684651051f3a7"}
Feb 21 22:56:01 crc kubenswrapper[4717]: I0221 22:56:01.440214 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6p7nf" podUID="8414f808-c789-4d49-b16e-fbd4dfff5261" containerName="registry-server" probeResult="failure" output=<
Feb 21 22:56:01 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s
Feb 21 22:56:01 crc kubenswrapper[4717]: >
Feb 21 22:56:01 crc kubenswrapper[4717]: I0221 22:56:01.486084 4717 generic.go:334] "Generic (PLEG): container finished" podID="4c52468a-a653-40cd-8b18-10ba07cc94c5" containerID="81d42d6348c5aac40ec0392ce655cf105fca9850b1154e49bc6684651051f3a7" exitCode=0
Feb 21 22:56:01 crc kubenswrapper[4717]: I0221 22:56:01.486168 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbpds" event={"ID":"4c52468a-a653-40cd-8b18-10ba07cc94c5","Type":"ContainerDied","Data":"81d42d6348c5aac40ec0392ce655cf105fca9850b1154e49bc6684651051f3a7"}
Feb 21 22:56:02 crc kubenswrapper[4717]: I0221 22:56:02.528337 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbpds" event={"ID":"4c52468a-a653-40cd-8b18-10ba07cc94c5","Type":"ContainerStarted","Data":"292ac98d3f90d02cb52dcf65842e141c9c36640a3c7588e6df14c0ba80f23321"}
Feb 21 22:56:02 crc kubenswrapper[4717]: I0221 22:56:02.563074 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rbpds" podStartSLOduration=3.149017916 podStartE2EDuration="5.563049004s" podCreationTimestamp="2026-02-21 22:55:57 +0000 UTC" firstStartedPulling="2026-02-21 22:55:59.458097173 +0000 UTC m=+4174.239630805" lastFinishedPulling="2026-02-21 22:56:01.872128271 +0000 UTC m=+4176.653661893" observedRunningTime="2026-02-21 22:56:02.551703184 +0000 UTC m=+4177.333236846" watchObservedRunningTime="2026-02-21 22:56:02.563049004 +0000 UTC m=+4177.344582666"
Feb 21 22:56:07 crc kubenswrapper[4717]: I0221 22:56:07.744191 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rbpds"
Feb 21 22:56:07 crc kubenswrapper[4717]: I0221 22:56:07.744981 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rbpds"
Feb 21 22:56:08 crc kubenswrapper[4717]: I0221 22:56:08.772500 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rbpds"
Feb 21 22:56:08 crc kubenswrapper[4717]: I0221 22:56:08.844117 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rbpds"
Feb 21 22:56:09 crc kubenswrapper[4717]: I0221 22:56:09.018386 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rbpds"]
Feb 21 22:56:09 crc kubenswrapper[4717]: I0221 22:56:09.062455 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 22:56:09 crc kubenswrapper[4717]: I0221 22:56:09.062557 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:56:10 crc kubenswrapper[4717]: I0221 22:56:10.633603 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rbpds" podUID="4c52468a-a653-40cd-8b18-10ba07cc94c5" containerName="registry-server" containerID="cri-o://292ac98d3f90d02cb52dcf65842e141c9c36640a3c7588e6df14c0ba80f23321" gracePeriod=2 Feb 21 22:56:11 crc kubenswrapper[4717]: I0221 22:56:11.490797 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6p7nf" Feb 21 22:56:11 crc kubenswrapper[4717]: I0221 22:56:11.608507 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6p7nf" Feb 21 22:56:11 crc kubenswrapper[4717]: I0221 22:56:11.650310 4717 generic.go:334] "Generic (PLEG): container finished" podID="4c52468a-a653-40cd-8b18-10ba07cc94c5" containerID="292ac98d3f90d02cb52dcf65842e141c9c36640a3c7588e6df14c0ba80f23321" exitCode=0 Feb 21 22:56:11 crc kubenswrapper[4717]: I0221 22:56:11.650686 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbpds" event={"ID":"4c52468a-a653-40cd-8b18-10ba07cc94c5","Type":"ContainerDied","Data":"292ac98d3f90d02cb52dcf65842e141c9c36640a3c7588e6df14c0ba80f23321"} Feb 21 22:56:11 crc kubenswrapper[4717]: I0221 22:56:11.819463 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rbpds" Feb 21 22:56:11 crc kubenswrapper[4717]: I0221 22:56:11.944720 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c52468a-a653-40cd-8b18-10ba07cc94c5-utilities\") pod \"4c52468a-a653-40cd-8b18-10ba07cc94c5\" (UID: \"4c52468a-a653-40cd-8b18-10ba07cc94c5\") " Feb 21 22:56:11 crc kubenswrapper[4717]: I0221 22:56:11.945213 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2xrq\" (UniqueName: \"kubernetes.io/projected/4c52468a-a653-40cd-8b18-10ba07cc94c5-kube-api-access-j2xrq\") pod \"4c52468a-a653-40cd-8b18-10ba07cc94c5\" (UID: \"4c52468a-a653-40cd-8b18-10ba07cc94c5\") " Feb 21 22:56:11 crc kubenswrapper[4717]: I0221 22:56:11.945390 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c52468a-a653-40cd-8b18-10ba07cc94c5-catalog-content\") pod \"4c52468a-a653-40cd-8b18-10ba07cc94c5\" (UID: \"4c52468a-a653-40cd-8b18-10ba07cc94c5\") " Feb 21 22:56:11 crc kubenswrapper[4717]: I0221 22:56:11.946464 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c52468a-a653-40cd-8b18-10ba07cc94c5-utilities" (OuterVolumeSpecName: "utilities") pod "4c52468a-a653-40cd-8b18-10ba07cc94c5" (UID: "4c52468a-a653-40cd-8b18-10ba07cc94c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:56:11 crc kubenswrapper[4717]: I0221 22:56:11.957556 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c52468a-a653-40cd-8b18-10ba07cc94c5-kube-api-access-j2xrq" (OuterVolumeSpecName: "kube-api-access-j2xrq") pod "4c52468a-a653-40cd-8b18-10ba07cc94c5" (UID: "4c52468a-a653-40cd-8b18-10ba07cc94c5"). InnerVolumeSpecName "kube-api-access-j2xrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:56:12 crc kubenswrapper[4717]: I0221 22:56:12.037797 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c52468a-a653-40cd-8b18-10ba07cc94c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c52468a-a653-40cd-8b18-10ba07cc94c5" (UID: "4c52468a-a653-40cd-8b18-10ba07cc94c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:56:12 crc kubenswrapper[4717]: I0221 22:56:12.047492 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2xrq\" (UniqueName: \"kubernetes.io/projected/4c52468a-a653-40cd-8b18-10ba07cc94c5-kube-api-access-j2xrq\") on node \"crc\" DevicePath \"\"" Feb 21 22:56:12 crc kubenswrapper[4717]: I0221 22:56:12.047532 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c52468a-a653-40cd-8b18-10ba07cc94c5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:56:12 crc kubenswrapper[4717]: I0221 22:56:12.047546 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c52468a-a653-40cd-8b18-10ba07cc94c5-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:56:12 crc kubenswrapper[4717]: I0221 22:56:12.667702 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbpds" event={"ID":"4c52468a-a653-40cd-8b18-10ba07cc94c5","Type":"ContainerDied","Data":"d3d1b63bede768249aecc28f7b78c8329ff8255eba2c255638cadc0e67b36988"} Feb 21 22:56:12 crc kubenswrapper[4717]: I0221 22:56:12.667767 4717 scope.go:117] "RemoveContainer" containerID="292ac98d3f90d02cb52dcf65842e141c9c36640a3c7588e6df14c0ba80f23321" Feb 21 22:56:12 crc kubenswrapper[4717]: I0221 22:56:12.667914 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rbpds" Feb 21 22:56:12 crc kubenswrapper[4717]: I0221 22:56:12.704576 4717 scope.go:117] "RemoveContainer" containerID="81d42d6348c5aac40ec0392ce655cf105fca9850b1154e49bc6684651051f3a7" Feb 21 22:56:12 crc kubenswrapper[4717]: I0221 22:56:12.727718 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rbpds"] Feb 21 22:56:12 crc kubenswrapper[4717]: I0221 22:56:12.738591 4717 scope.go:117] "RemoveContainer" containerID="a45b8a58be5ddf67d36ce3874cfc97582592664ca9e59868dac90b49735f2adc" Feb 21 22:56:12 crc kubenswrapper[4717]: I0221 22:56:12.748833 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rbpds"] Feb 21 22:56:12 crc kubenswrapper[4717]: I0221 22:56:12.825959 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6p7nf"] Feb 21 22:56:12 crc kubenswrapper[4717]: I0221 22:56:12.826297 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6p7nf" podUID="8414f808-c789-4d49-b16e-fbd4dfff5261" containerName="registry-server" containerID="cri-o://5e8e1473835cc6f439d30588d3fd4cabf71d1de18d4876cb433ad7d0a40405ad" gracePeriod=2 Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.354275 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6p7nf" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.479142 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x4w4\" (UniqueName: \"kubernetes.io/projected/8414f808-c789-4d49-b16e-fbd4dfff5261-kube-api-access-6x4w4\") pod \"8414f808-c789-4d49-b16e-fbd4dfff5261\" (UID: \"8414f808-c789-4d49-b16e-fbd4dfff5261\") " Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.479374 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8414f808-c789-4d49-b16e-fbd4dfff5261-catalog-content\") pod \"8414f808-c789-4d49-b16e-fbd4dfff5261\" (UID: \"8414f808-c789-4d49-b16e-fbd4dfff5261\") " Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.479415 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8414f808-c789-4d49-b16e-fbd4dfff5261-utilities\") pod \"8414f808-c789-4d49-b16e-fbd4dfff5261\" (UID: \"8414f808-c789-4d49-b16e-fbd4dfff5261\") " Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.480554 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8414f808-c789-4d49-b16e-fbd4dfff5261-utilities" (OuterVolumeSpecName: "utilities") pod "8414f808-c789-4d49-b16e-fbd4dfff5261" (UID: "8414f808-c789-4d49-b16e-fbd4dfff5261"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.488606 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8414f808-c789-4d49-b16e-fbd4dfff5261-kube-api-access-6x4w4" (OuterVolumeSpecName: "kube-api-access-6x4w4") pod "8414f808-c789-4d49-b16e-fbd4dfff5261" (UID: "8414f808-c789-4d49-b16e-fbd4dfff5261"). InnerVolumeSpecName "kube-api-access-6x4w4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.582569 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8414f808-c789-4d49-b16e-fbd4dfff5261-utilities\") on node \"crc\" DevicePath \"\"" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.582621 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x4w4\" (UniqueName: \"kubernetes.io/projected/8414f808-c789-4d49-b16e-fbd4dfff5261-kube-api-access-6x4w4\") on node \"crc\" DevicePath \"\"" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.663771 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8414f808-c789-4d49-b16e-fbd4dfff5261-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8414f808-c789-4d49-b16e-fbd4dfff5261" (UID: "8414f808-c789-4d49-b16e-fbd4dfff5261"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.684505 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8414f808-c789-4d49-b16e-fbd4dfff5261-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.685856 4717 generic.go:334] "Generic (PLEG): container finished" podID="8414f808-c789-4d49-b16e-fbd4dfff5261" containerID="5e8e1473835cc6f439d30588d3fd4cabf71d1de18d4876cb433ad7d0a40405ad" exitCode=0 Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.685930 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p7nf" event={"ID":"8414f808-c789-4d49-b16e-fbd4dfff5261","Type":"ContainerDied","Data":"5e8e1473835cc6f439d30588d3fd4cabf71d1de18d4876cb433ad7d0a40405ad"} Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.686005 4717 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-6p7nf" event={"ID":"8414f808-c789-4d49-b16e-fbd4dfff5261","Type":"ContainerDied","Data":"e1b01fe253f3948174106dd7cd93a3bcd3076921e41cad140477b474b33593a8"} Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.686013 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6p7nf" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.686039 4717 scope.go:117] "RemoveContainer" containerID="5e8e1473835cc6f439d30588d3fd4cabf71d1de18d4876cb433ad7d0a40405ad" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.717594 4717 scope.go:117] "RemoveContainer" containerID="695c0f499b4eb9e368c5c387760ae7026ada6d6f13ddf04ba916c0f47c2cc6a7" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.757543 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6p7nf"] Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.775248 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6p7nf"] Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.822528 4717 scope.go:117] "RemoveContainer" containerID="88bcf22b3b1680e94b76a7518a9903a0c3e338369d56f1f8cdecbee515f225b4" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.855223 4717 scope.go:117] "RemoveContainer" containerID="5e8e1473835cc6f439d30588d3fd4cabf71d1de18d4876cb433ad7d0a40405ad" Feb 21 22:56:13 crc kubenswrapper[4717]: E0221 22:56:13.855639 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e8e1473835cc6f439d30588d3fd4cabf71d1de18d4876cb433ad7d0a40405ad\": container with ID starting with 5e8e1473835cc6f439d30588d3fd4cabf71d1de18d4876cb433ad7d0a40405ad not found: ID does not exist" containerID="5e8e1473835cc6f439d30588d3fd4cabf71d1de18d4876cb433ad7d0a40405ad" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.855678 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e8e1473835cc6f439d30588d3fd4cabf71d1de18d4876cb433ad7d0a40405ad"} err="failed to get container status \"5e8e1473835cc6f439d30588d3fd4cabf71d1de18d4876cb433ad7d0a40405ad\": rpc error: code = NotFound desc = could not find container \"5e8e1473835cc6f439d30588d3fd4cabf71d1de18d4876cb433ad7d0a40405ad\": container with ID starting with 5e8e1473835cc6f439d30588d3fd4cabf71d1de18d4876cb433ad7d0a40405ad not found: ID does not exist" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.855704 4717 scope.go:117] "RemoveContainer" containerID="695c0f499b4eb9e368c5c387760ae7026ada6d6f13ddf04ba916c0f47c2cc6a7" Feb 21 22:56:13 crc kubenswrapper[4717]: E0221 22:56:13.855988 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"695c0f499b4eb9e368c5c387760ae7026ada6d6f13ddf04ba916c0f47c2cc6a7\": container with ID starting with 695c0f499b4eb9e368c5c387760ae7026ada6d6f13ddf04ba916c0f47c2cc6a7 not found: ID does not exist" containerID="695c0f499b4eb9e368c5c387760ae7026ada6d6f13ddf04ba916c0f47c2cc6a7" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.856020 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"695c0f499b4eb9e368c5c387760ae7026ada6d6f13ddf04ba916c0f47c2cc6a7"} err="failed to get container status \"695c0f499b4eb9e368c5c387760ae7026ada6d6f13ddf04ba916c0f47c2cc6a7\": rpc error: code = NotFound desc = could not find container \"695c0f499b4eb9e368c5c387760ae7026ada6d6f13ddf04ba916c0f47c2cc6a7\": container with ID starting with 695c0f499b4eb9e368c5c387760ae7026ada6d6f13ddf04ba916c0f47c2cc6a7 not found: ID does not exist" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.856040 4717 scope.go:117] "RemoveContainer" containerID="88bcf22b3b1680e94b76a7518a9903a0c3e338369d56f1f8cdecbee515f225b4" Feb 21 22:56:13 crc kubenswrapper[4717]: E0221 
22:56:13.856547 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88bcf22b3b1680e94b76a7518a9903a0c3e338369d56f1f8cdecbee515f225b4\": container with ID starting with 88bcf22b3b1680e94b76a7518a9903a0c3e338369d56f1f8cdecbee515f225b4 not found: ID does not exist" containerID="88bcf22b3b1680e94b76a7518a9903a0c3e338369d56f1f8cdecbee515f225b4" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.856579 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88bcf22b3b1680e94b76a7518a9903a0c3e338369d56f1f8cdecbee515f225b4"} err="failed to get container status \"88bcf22b3b1680e94b76a7518a9903a0c3e338369d56f1f8cdecbee515f225b4\": rpc error: code = NotFound desc = could not find container \"88bcf22b3b1680e94b76a7518a9903a0c3e338369d56f1f8cdecbee515f225b4\": container with ID starting with 88bcf22b3b1680e94b76a7518a9903a0c3e338369d56f1f8cdecbee515f225b4 not found: ID does not exist" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.994475 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c52468a-a653-40cd-8b18-10ba07cc94c5" path="/var/lib/kubelet/pods/4c52468a-a653-40cd-8b18-10ba07cc94c5/volumes" Feb 21 22:56:13 crc kubenswrapper[4717]: I0221 22:56:13.995911 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8414f808-c789-4d49-b16e-fbd4dfff5261" path="/var/lib/kubelet/pods/8414f808-c789-4d49-b16e-fbd4dfff5261/volumes" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.748602 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r6rx9"] Feb 21 22:56:38 crc kubenswrapper[4717]: E0221 22:56:38.750166 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c52468a-a653-40cd-8b18-10ba07cc94c5" containerName="extract-utilities" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.750202 4717 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="4c52468a-a653-40cd-8b18-10ba07cc94c5" containerName="extract-utilities" Feb 21 22:56:38 crc kubenswrapper[4717]: E0221 22:56:38.750244 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8414f808-c789-4d49-b16e-fbd4dfff5261" containerName="registry-server" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.750262 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8414f808-c789-4d49-b16e-fbd4dfff5261" containerName="registry-server" Feb 21 22:56:38 crc kubenswrapper[4717]: E0221 22:56:38.750354 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c52468a-a653-40cd-8b18-10ba07cc94c5" containerName="registry-server" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.750375 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c52468a-a653-40cd-8b18-10ba07cc94c5" containerName="registry-server" Feb 21 22:56:38 crc kubenswrapper[4717]: E0221 22:56:38.750429 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8414f808-c789-4d49-b16e-fbd4dfff5261" containerName="extract-content" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.750447 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8414f808-c789-4d49-b16e-fbd4dfff5261" containerName="extract-content" Feb 21 22:56:38 crc kubenswrapper[4717]: E0221 22:56:38.750492 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c52468a-a653-40cd-8b18-10ba07cc94c5" containerName="extract-content" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.750508 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c52468a-a653-40cd-8b18-10ba07cc94c5" containerName="extract-content" Feb 21 22:56:38 crc kubenswrapper[4717]: E0221 22:56:38.750556 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8414f808-c789-4d49-b16e-fbd4dfff5261" containerName="extract-utilities" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.750574 4717 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8414f808-c789-4d49-b16e-fbd4dfff5261" containerName="extract-utilities" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.751083 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8414f808-c789-4d49-b16e-fbd4dfff5261" containerName="registry-server" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.751131 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c52468a-a653-40cd-8b18-10ba07cc94c5" containerName="registry-server" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.754492 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r6rx9" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.774084 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r6rx9"] Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.867274 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74638657-6654-4c28-81ff-1ea91002ef1e-utilities\") pod \"certified-operators-r6rx9\" (UID: \"74638657-6654-4c28-81ff-1ea91002ef1e\") " pod="openshift-marketplace/certified-operators-r6rx9" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.867771 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74638657-6654-4c28-81ff-1ea91002ef1e-catalog-content\") pod \"certified-operators-r6rx9\" (UID: \"74638657-6654-4c28-81ff-1ea91002ef1e\") " pod="openshift-marketplace/certified-operators-r6rx9" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.868150 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx2kf\" (UniqueName: \"kubernetes.io/projected/74638657-6654-4c28-81ff-1ea91002ef1e-kube-api-access-tx2kf\") pod 
\"certified-operators-r6rx9\" (UID: \"74638657-6654-4c28-81ff-1ea91002ef1e\") " pod="openshift-marketplace/certified-operators-r6rx9" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.971334 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74638657-6654-4c28-81ff-1ea91002ef1e-catalog-content\") pod \"certified-operators-r6rx9\" (UID: \"74638657-6654-4c28-81ff-1ea91002ef1e\") " pod="openshift-marketplace/certified-operators-r6rx9" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.971467 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx2kf\" (UniqueName: \"kubernetes.io/projected/74638657-6654-4c28-81ff-1ea91002ef1e-kube-api-access-tx2kf\") pod \"certified-operators-r6rx9\" (UID: \"74638657-6654-4c28-81ff-1ea91002ef1e\") " pod="openshift-marketplace/certified-operators-r6rx9" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.971588 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74638657-6654-4c28-81ff-1ea91002ef1e-utilities\") pod \"certified-operators-r6rx9\" (UID: \"74638657-6654-4c28-81ff-1ea91002ef1e\") " pod="openshift-marketplace/certified-operators-r6rx9" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.972174 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74638657-6654-4c28-81ff-1ea91002ef1e-utilities\") pod \"certified-operators-r6rx9\" (UID: \"74638657-6654-4c28-81ff-1ea91002ef1e\") " pod="openshift-marketplace/certified-operators-r6rx9" Feb 21 22:56:38 crc kubenswrapper[4717]: I0221 22:56:38.974365 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74638657-6654-4c28-81ff-1ea91002ef1e-catalog-content\") pod \"certified-operators-r6rx9\" (UID: 
\"74638657-6654-4c28-81ff-1ea91002ef1e\") " pod="openshift-marketplace/certified-operators-r6rx9" Feb 21 22:56:39 crc kubenswrapper[4717]: I0221 22:56:39.006013 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx2kf\" (UniqueName: \"kubernetes.io/projected/74638657-6654-4c28-81ff-1ea91002ef1e-kube-api-access-tx2kf\") pod \"certified-operators-r6rx9\" (UID: \"74638657-6654-4c28-81ff-1ea91002ef1e\") " pod="openshift-marketplace/certified-operators-r6rx9" Feb 21 22:56:39 crc kubenswrapper[4717]: I0221 22:56:39.063568 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 21 22:56:39 crc kubenswrapper[4717]: I0221 22:56:39.063632 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 21 22:56:39 crc kubenswrapper[4717]: I0221 22:56:39.100853 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r6rx9"
Feb 21 22:56:39 crc kubenswrapper[4717]: I0221 22:56:39.564326 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r6rx9"]
Feb 21 22:56:39 crc kubenswrapper[4717]: I0221 22:56:39.988432 4717 generic.go:334] "Generic (PLEG): container finished" podID="74638657-6654-4c28-81ff-1ea91002ef1e" containerID="fae110986f562e72c39f4782563a84a13d8c0e7b90a45ef1340243db12e80807" exitCode=0
Feb 21 22:56:39 crc kubenswrapper[4717]: I0221 22:56:39.989617 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6rx9" event={"ID":"74638657-6654-4c28-81ff-1ea91002ef1e","Type":"ContainerDied","Data":"fae110986f562e72c39f4782563a84a13d8c0e7b90a45ef1340243db12e80807"}
Feb 21 22:56:39 crc kubenswrapper[4717]: I0221 22:56:39.989655 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6rx9" event={"ID":"74638657-6654-4c28-81ff-1ea91002ef1e","Type":"ContainerStarted","Data":"a86a5c7088191f88893d370e95e0f1e32e89969e2d38b6e70769c689286a5bb8"}
Feb 21 22:56:40 crc kubenswrapper[4717]: I0221 22:56:40.998917 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6rx9" event={"ID":"74638657-6654-4c28-81ff-1ea91002ef1e","Type":"ContainerStarted","Data":"1c28def93f019a37e3c64eea9118ea03d18018e8e8567d38625d0ef897bc007b"}
Feb 21 22:56:42 crc kubenswrapper[4717]: I0221 22:56:42.019329 4717 generic.go:334] "Generic (PLEG): container finished" podID="74638657-6654-4c28-81ff-1ea91002ef1e" containerID="1c28def93f019a37e3c64eea9118ea03d18018e8e8567d38625d0ef897bc007b" exitCode=0
Feb 21 22:56:42 crc kubenswrapper[4717]: I0221 22:56:42.019411 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6rx9" event={"ID":"74638657-6654-4c28-81ff-1ea91002ef1e","Type":"ContainerDied","Data":"1c28def93f019a37e3c64eea9118ea03d18018e8e8567d38625d0ef897bc007b"}
Feb 21 22:56:43 crc kubenswrapper[4717]: I0221 22:56:43.033785 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6rx9" event={"ID":"74638657-6654-4c28-81ff-1ea91002ef1e","Type":"ContainerStarted","Data":"3e57ca69b854100127956b625d5aa6a5326579c8fab3f92c908739d10f256a28"}
Feb 21 22:56:43 crc kubenswrapper[4717]: I0221 22:56:43.071438 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r6rx9" podStartSLOduration=2.630511608 podStartE2EDuration="5.071404718s" podCreationTimestamp="2026-02-21 22:56:38 +0000 UTC" firstStartedPulling="2026-02-21 22:56:39.991074276 +0000 UTC m=+4214.772607898" lastFinishedPulling="2026-02-21 22:56:42.431967386 +0000 UTC m=+4217.213501008" observedRunningTime="2026-02-21 22:56:43.065724443 +0000 UTC m=+4217.847258075" watchObservedRunningTime="2026-02-21 22:56:43.071404718 +0000 UTC m=+4217.852938380"
Feb 21 22:56:49 crc kubenswrapper[4717]: I0221 22:56:49.101642 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r6rx9"
Feb 21 22:56:49 crc kubenswrapper[4717]: I0221 22:56:49.102171 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r6rx9"
Feb 21 22:56:49 crc kubenswrapper[4717]: I0221 22:56:49.180337 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r6rx9"
Feb 21 22:56:50 crc kubenswrapper[4717]: I0221 22:56:50.176615 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r6rx9"
Feb 21 22:56:50 crc kubenswrapper[4717]: I0221 22:56:50.245546 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r6rx9"]
Feb 21 22:56:52 crc kubenswrapper[4717]: I0221 22:56:52.140968 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r6rx9" podUID="74638657-6654-4c28-81ff-1ea91002ef1e" containerName="registry-server" containerID="cri-o://3e57ca69b854100127956b625d5aa6a5326579c8fab3f92c908739d10f256a28" gracePeriod=2
Feb 21 22:56:52 crc kubenswrapper[4717]: I0221 22:56:52.637146 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r6rx9"
Feb 21 22:56:52 crc kubenswrapper[4717]: I0221 22:56:52.779987 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx2kf\" (UniqueName: \"kubernetes.io/projected/74638657-6654-4c28-81ff-1ea91002ef1e-kube-api-access-tx2kf\") pod \"74638657-6654-4c28-81ff-1ea91002ef1e\" (UID: \"74638657-6654-4c28-81ff-1ea91002ef1e\") "
Feb 21 22:56:52 crc kubenswrapper[4717]: I0221 22:56:52.780640 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74638657-6654-4c28-81ff-1ea91002ef1e-utilities\") pod \"74638657-6654-4c28-81ff-1ea91002ef1e\" (UID: \"74638657-6654-4c28-81ff-1ea91002ef1e\") "
Feb 21 22:56:52 crc kubenswrapper[4717]: I0221 22:56:52.782562 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74638657-6654-4c28-81ff-1ea91002ef1e-utilities" (OuterVolumeSpecName: "utilities") pod "74638657-6654-4c28-81ff-1ea91002ef1e" (UID: "74638657-6654-4c28-81ff-1ea91002ef1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:56:52 crc kubenswrapper[4717]: I0221 22:56:52.782726 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74638657-6654-4c28-81ff-1ea91002ef1e-catalog-content\") pod \"74638657-6654-4c28-81ff-1ea91002ef1e\" (UID: \"74638657-6654-4c28-81ff-1ea91002ef1e\") "
Feb 21 22:56:52 crc kubenswrapper[4717]: I0221 22:56:52.787264 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74638657-6654-4c28-81ff-1ea91002ef1e-utilities\") on node \"crc\" DevicePath \"\""
Feb 21 22:56:52 crc kubenswrapper[4717]: I0221 22:56:52.789304 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74638657-6654-4c28-81ff-1ea91002ef1e-kube-api-access-tx2kf" (OuterVolumeSpecName: "kube-api-access-tx2kf") pod "74638657-6654-4c28-81ff-1ea91002ef1e" (UID: "74638657-6654-4c28-81ff-1ea91002ef1e"). InnerVolumeSpecName "kube-api-access-tx2kf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 21 22:56:52 crc kubenswrapper[4717]: I0221 22:56:52.877295 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74638657-6654-4c28-81ff-1ea91002ef1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74638657-6654-4c28-81ff-1ea91002ef1e" (UID: "74638657-6654-4c28-81ff-1ea91002ef1e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 21 22:56:52 crc kubenswrapper[4717]: I0221 22:56:52.889921 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx2kf\" (UniqueName: \"kubernetes.io/projected/74638657-6654-4c28-81ff-1ea91002ef1e-kube-api-access-tx2kf\") on node \"crc\" DevicePath \"\""
Feb 21 22:56:52 crc kubenswrapper[4717]: I0221 22:56:52.889958 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74638657-6654-4c28-81ff-1ea91002ef1e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.156484 4717 generic.go:334] "Generic (PLEG): container finished" podID="74638657-6654-4c28-81ff-1ea91002ef1e" containerID="3e57ca69b854100127956b625d5aa6a5326579c8fab3f92c908739d10f256a28" exitCode=0
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.156532 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6rx9" event={"ID":"74638657-6654-4c28-81ff-1ea91002ef1e","Type":"ContainerDied","Data":"3e57ca69b854100127956b625d5aa6a5326579c8fab3f92c908739d10f256a28"}
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.156614 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6rx9" event={"ID":"74638657-6654-4c28-81ff-1ea91002ef1e","Type":"ContainerDied","Data":"a86a5c7088191f88893d370e95e0f1e32e89969e2d38b6e70769c689286a5bb8"}
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.156643 4717 scope.go:117] "RemoveContainer" containerID="3e57ca69b854100127956b625d5aa6a5326579c8fab3f92c908739d10f256a28"
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.158096 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r6rx9"
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.184654 4717 scope.go:117] "RemoveContainer" containerID="1c28def93f019a37e3c64eea9118ea03d18018e8e8567d38625d0ef897bc007b"
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.214327 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r6rx9"]
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.230009 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r6rx9"]
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.708848 4717 scope.go:117] "RemoveContainer" containerID="fae110986f562e72c39f4782563a84a13d8c0e7b90a45ef1340243db12e80807"
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.767498 4717 scope.go:117] "RemoveContainer" containerID="3e57ca69b854100127956b625d5aa6a5326579c8fab3f92c908739d10f256a28"
Feb 21 22:56:53 crc kubenswrapper[4717]: E0221 22:56:53.768375 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e57ca69b854100127956b625d5aa6a5326579c8fab3f92c908739d10f256a28\": container with ID starting with 3e57ca69b854100127956b625d5aa6a5326579c8fab3f92c908739d10f256a28 not found: ID does not exist" containerID="3e57ca69b854100127956b625d5aa6a5326579c8fab3f92c908739d10f256a28"
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.768426 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e57ca69b854100127956b625d5aa6a5326579c8fab3f92c908739d10f256a28"} err="failed to get container status \"3e57ca69b854100127956b625d5aa6a5326579c8fab3f92c908739d10f256a28\": rpc error: code = NotFound desc = could not find container \"3e57ca69b854100127956b625d5aa6a5326579c8fab3f92c908739d10f256a28\": container with ID starting with 3e57ca69b854100127956b625d5aa6a5326579c8fab3f92c908739d10f256a28 not found: ID does not exist"
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.768463 4717 scope.go:117] "RemoveContainer" containerID="1c28def93f019a37e3c64eea9118ea03d18018e8e8567d38625d0ef897bc007b"
Feb 21 22:56:53 crc kubenswrapper[4717]: E0221 22:56:53.769114 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c28def93f019a37e3c64eea9118ea03d18018e8e8567d38625d0ef897bc007b\": container with ID starting with 1c28def93f019a37e3c64eea9118ea03d18018e8e8567d38625d0ef897bc007b not found: ID does not exist" containerID="1c28def93f019a37e3c64eea9118ea03d18018e8e8567d38625d0ef897bc007b"
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.769176 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c28def93f019a37e3c64eea9118ea03d18018e8e8567d38625d0ef897bc007b"} err="failed to get container status \"1c28def93f019a37e3c64eea9118ea03d18018e8e8567d38625d0ef897bc007b\": rpc error: code = NotFound desc = could not find container \"1c28def93f019a37e3c64eea9118ea03d18018e8e8567d38625d0ef897bc007b\": container with ID starting with 1c28def93f019a37e3c64eea9118ea03d18018e8e8567d38625d0ef897bc007b not found: ID does not exist"
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.769214 4717 scope.go:117] "RemoveContainer" containerID="fae110986f562e72c39f4782563a84a13d8c0e7b90a45ef1340243db12e80807"
Feb 21 22:56:53 crc kubenswrapper[4717]: E0221 22:56:53.769673 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fae110986f562e72c39f4782563a84a13d8c0e7b90a45ef1340243db12e80807\": container with ID starting with fae110986f562e72c39f4782563a84a13d8c0e7b90a45ef1340243db12e80807 not found: ID does not exist" containerID="fae110986f562e72c39f4782563a84a13d8c0e7b90a45ef1340243db12e80807"
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.769721 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fae110986f562e72c39f4782563a84a13d8c0e7b90a45ef1340243db12e80807"} err="failed to get container status \"fae110986f562e72c39f4782563a84a13d8c0e7b90a45ef1340243db12e80807\": rpc error: code = NotFound desc = could not find container \"fae110986f562e72c39f4782563a84a13d8c0e7b90a45ef1340243db12e80807\": container with ID starting with fae110986f562e72c39f4782563a84a13d8c0e7b90a45ef1340243db12e80807 not found: ID does not exist"
Feb 21 22:56:53 crc kubenswrapper[4717]: I0221 22:56:53.992795 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74638657-6654-4c28-81ff-1ea91002ef1e" path="/var/lib/kubelet/pods/74638657-6654-4c28-81ff-1ea91002ef1e/volumes"
Feb 21 22:57:09 crc kubenswrapper[4717]: I0221 22:57:09.063182 4717 patch_prober.go:28] interesting pod/machine-config-daemon-flt22 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 21 22:57:09 crc kubenswrapper[4717]: I0221 22:57:09.063775 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 21 22:57:09 crc kubenswrapper[4717]: I0221 22:57:09.063842 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-flt22"
Feb 21 22:57:09 crc kubenswrapper[4717]: I0221 22:57:09.064806 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cce2a7f124d7516a154981ac0890e510e8ff6be01df9c5bc02a467d04942fc35"} pod="openshift-machine-config-operator/machine-config-daemon-flt22" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 21 22:57:09 crc kubenswrapper[4717]: I0221 22:57:09.064933 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerName="machine-config-daemon" containerID="cri-o://cce2a7f124d7516a154981ac0890e510e8ff6be01df9c5bc02a467d04942fc35" gracePeriod=600
Feb 21 22:57:09 crc kubenswrapper[4717]: E0221 22:57:09.197971 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:57:09 crc kubenswrapper[4717]: I0221 22:57:09.338631 4717 generic.go:334] "Generic (PLEG): container finished" podID="cc5eeb62-90d6-4f10-9b58-f147b23eb775" containerID="cce2a7f124d7516a154981ac0890e510e8ff6be01df9c5bc02a467d04942fc35" exitCode=0
Feb 21 22:57:09 crc kubenswrapper[4717]: I0221 22:57:09.338711 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-flt22" event={"ID":"cc5eeb62-90d6-4f10-9b58-f147b23eb775","Type":"ContainerDied","Data":"cce2a7f124d7516a154981ac0890e510e8ff6be01df9c5bc02a467d04942fc35"}
Feb 21 22:57:09 crc kubenswrapper[4717]: I0221 22:57:09.339111 4717 scope.go:117] "RemoveContainer" containerID="984daa8c5a04e9002764dbe7ebc03997ead5f21f5959852a7982e44ec2f84886"
Feb 21 22:57:09 crc kubenswrapper[4717]: I0221 22:57:09.339953 4717 scope.go:117] "RemoveContainer" containerID="cce2a7f124d7516a154981ac0890e510e8ff6be01df9c5bc02a467d04942fc35"
Feb 21 22:57:09 crc kubenswrapper[4717]: E0221 22:57:09.340410 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:57:21 crc kubenswrapper[4717]: I0221 22:57:21.976049 4717 scope.go:117] "RemoveContainer" containerID="cce2a7f124d7516a154981ac0890e510e8ff6be01df9c5bc02a467d04942fc35"
Feb 21 22:57:21 crc kubenswrapper[4717]: E0221 22:57:21.976896 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:57:36 crc kubenswrapper[4717]: I0221 22:57:36.977049 4717 scope.go:117] "RemoveContainer" containerID="cce2a7f124d7516a154981ac0890e510e8ff6be01df9c5bc02a467d04942fc35"
Feb 21 22:57:36 crc kubenswrapper[4717]: E0221 22:57:36.978313 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:57:48 crc kubenswrapper[4717]: I0221 22:57:48.976994 4717 scope.go:117] "RemoveContainer" containerID="cce2a7f124d7516a154981ac0890e510e8ff6be01df9c5bc02a467d04942fc35"
Feb 21 22:57:48 crc kubenswrapper[4717]: E0221 22:57:48.978214 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:58:02 crc kubenswrapper[4717]: I0221 22:58:02.976065 4717 scope.go:117] "RemoveContainer" containerID="cce2a7f124d7516a154981ac0890e510e8ff6be01df9c5bc02a467d04942fc35"
Feb 21 22:58:02 crc kubenswrapper[4717]: E0221 22:58:02.977054 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"
Feb 21 22:58:17 crc kubenswrapper[4717]: I0221 22:58:17.976477 4717 scope.go:117] "RemoveContainer" containerID="cce2a7f124d7516a154981ac0890e510e8ff6be01df9c5bc02a467d04942fc35"
Feb 21 22:58:17 crc kubenswrapper[4717]: E0221 22:58:17.977368 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-flt22_openshift-machine-config-operator(cc5eeb62-90d6-4f10-9b58-f147b23eb775)\"" pod="openshift-machine-config-operator/machine-config-daemon-flt22" podUID="cc5eeb62-90d6-4f10-9b58-f147b23eb775"